Moving Content To Another Website With No Redirect?
-
I've got a website that has lots of valuable content and tools but it's been hit too hard by both Panda and Penguin. I came to the conclusion that I'd be better off with a new website, as this one is going to hell no matter how much time and money I put into it. Had I started a new website the first time it got hit by Penguin, I'd be profitable today.
I'd like to move some of that content to this other domain but I don't want to do 301 redirects as I don't want to pass bad link juice. I know I'll lose all links and visitors to the original website but I don't care.
My only concern is duplicate content. I was thinking of setting the pages to noindex on the original website and wait until they don't appear in Google's index. Then I'd move them over to the new domain to be indexed again.
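For reference, the noindex step described above is usually done with a robots meta tag in the `<head>` of each affected page (or an equivalent `X-Robots-Tag` HTTP header). A generic illustration, not the asker's actual markup:

```html
<!-- In the <head> of each page on the old site that should drop out of the index.
     "follow" lets Googlebot keep crawling links on the page while it is deindexed. -->
<meta name="robots" content="noindex, follow">
```

You can confirm the pages have actually dropped out with a site:olddomain.com search before republishing the content on the new domain.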
Do you see any problem with this? Should I rewrite everything instead? I hate spinning content...!
-
If we're understanding the situation correctly, I'd say this sums it up pretty well.
-
It sounds to me as though most of the content from the old site is staying but that 3 enigmatic 'tools' are being moved to a new domain.
In which case I would want to be sure that the functionality being moved wasn't the cause of the previously lifted penalty, especially from a Panda perspective (given that the tools on the new domain presumably won't have any links pointing to them, Penguin shouldn't be an issue) - as a penalty would be re-applied if the tools are not Panda-friendly.
So:
- If you want to have the tools on both sites, I'm with Pete - noindex the tools on the old site.
- If you are permanently moving the tools, review them for Panda-friendliness and then noindex the old site's URLs; it's probably worth blocking the old URLs in robots.txt as well.
- If your previous penalty was nothing to do with the tools at all, and the link profile of those pages is good (or if there aren't any links), then 301 the old URLs to the new.
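For that last option, a selective 301 on an Apache server might look something like this in the old site's .htaccess. The domains and tool paths below are placeholders, since the actual URLs aren't given in the thread:

```apache
# Redirect only the tool URLs whose link profiles are clean;
# leave every other URL on the old site untouched.
RewriteEngine On
RewriteRule ^tools/keyword-tool$ http://www.newdomain.com/tools/keyword-tool [R=301,L]
RewriteRule ^tools/rank-checker$ http://www.newdomain.com/tools/rank-checker [R=301,L]
```

Redirecting page-by-page like this, rather than with a blanket domain-wide rule, is what lets you leave the questionable URLs behind.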
That's if between Pete and myself we've understood correctly what you're trying to achieve.
Good Luck!
-
So, I'm confused - are you looking to keep both sites active? If you're just moving the tools to a new domain, you could NOINDEX the old pages. If the link-based penalty isn't too severe, you might try a cross-domain rel=canonical on the old site. Unfortunately, without understanding the penalty profile, it's a bit tricky to advise. It's really a cost/benefit trade-off - how much risk of carrying the penalty are you willing to accept vs. the alternative of cutting off all authority and starting over on the new site.
If you've had Panda-related problems, though, I wouldn't keep the tools crawlable on both sites. That seems more likely to prolong your problems than it is to solve them.
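The cross-domain rel=canonical mentioned above is just a link element on the old page pointing at the tool's new home. Again, the URLs here are placeholders:

```html
<!-- On http://www.olddomain.com/tools/keyword-tool, in the <head> -->
<link rel="canonical" href="http://www.newdomain.com/tools/keyword-tool">
```

Worth remembering that a cross-domain canonical is a hint, not a directive - Google can ignore it if the two pages diverge too much.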
-
In fact, I am not moving any content from the old website to the new one. It's just the 3 online tools that I wanted to keep for the new website. The two sites have different content, but the functionality of the tools is the same. I've noindexed the tools on the old website.
By the way, the manual penalty on the old website was revoked a few weeks ago.
-
I tend to agree with Martin - it seems like there's probably a way to preserve some of the power of the old site and 301-redirect selectively (or potentially use cross-domain rel=canonical tags), but it would take a much deeper understanding of the site than Q&A allows.
If you rebuild the site from scratch, you'd almost always want to de-index the old site. I'd flat out remove it via Google Webmaster Tools - it's the fastest method. Leaving both sites crawlable is only going to compound your problems and haunt the new site.
I'd warn, though, that if this is Panda-related, just moving the content won't solve your problems. You do have to sort out why they happened in the first place, or the same algorithmic issues will just come back. In other words, if the problems are content-related, then it doesn't really matter where the content lives. If the problems are link-related, then moving will remove the problems. Of course, moving will also remove any advantages you currently have based on good links.
Unfortunately, this isn't a problem that can be addressed without a pretty deep audit. My gut feeling is that there may be a way to preserve some of the authority of the old site, but you really need to pin down the problems. "Panda + Penguin" covers a wide swath of potential problems, and that just isn't enough information to do this right.
-
Some of this "content" is in fact online tools and the tutorials that accompany them.
-
Hi Stephane,
All the below assumes you feel there is some value in keeping the original website live at all.
My first reaction would be to do a full review of all your old content and carefully consider which pages may have been hit by Panda - is there keyword stuffing, content duplicated from other sites, thin content, etc.? Then either fix or completely rewrite those.
After that you should avoid publishing duplicated content, so my view would be:
1. Remove the rewritten/fixed articles completely from the old site
2. Don't implement the 301 so you don't get any redirected bad Penguin vibe
3. Put a block on those URLs using robots.txt
4. Remove the URLs from Google's index in Webmaster Tools
Then you are free to publish your new, Panda-friendly content to your new website.
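Steps 3 and 4 might look like this on the old domain (the article paths are placeholders). One caveat: if you block the URLs in robots.txt before Google has recrawled them, Googlebot can no longer fetch the pages to see any noindex tag, so they may linger in the index longer - the Webmaster Tools removal request in step 4 is what gets around that:

```text
# robots.txt on the old domain - block the retired article URLs
User-agent: *
Disallow: /articles/old-article-1/
Disallow: /articles/old-article-2/
```

After the robots.txt block is in place, submit the same URLs through the URL removal tool so they drop out of the index rather than just going uncrawled.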
Not sure what other mozzers would say, but that's my view. This is not about 'spinning content' but removing poor content and republishing great content. Hope it makes sense.