Using copy from a current site on a new one
-
I have a client who is closing down his local business because he's moving to another state. When he gets there he will launch a new website. On his current website he put in a lot of work and has a ton of good copy, including blog posts that have helped gain him excellent rankings. He's asking me if he can use that copy on his new site and get original author credit for it, like he did on his current site.

Can he use the same copy from his current website on his new website without any problems, and get original author credit for it? Would it be best to shut down the old site, or to 301 all of the pages being moved to the new corresponding pages? If 301s are the way to go, how long should he leave those in place?

Thanks!
Kirk
-
Thanks!
-
Hello Kirk,
As long as you point the root domain at the new site, all should be well. I went through the process a few months ago with a client's website; no problems were encountered.
I collated the articles I found useful while preparing for the process.
-
Ta. With the old site going down, Joe's and Egol's advice is spot on. All straightforward.
Hope that helps.
-
Hi Joe, I apologize for the slow reply. I did not get any notifications of replies to my question.
Does your suggestion apply even if the old website will be taken down? (Which is the case)
thank you, Kirk
-
Thank you for this info!
-
Hi Don, The old site will be taken down. (I apologize for the slow reply. I did not get any notifications of replies to my question.)
-
If a thorough job of using 301s to redirect the site is done, as Joe Viveiros suggested, and those 301s remain in place forever, then all content can be safely moved and all link equity should follow. It will take a while for Google to figure this out, and possibly a lot longer for Google to appreciate the original author credit, but everything should be fine in a few to several months.
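For reference, a domain-wide, path-for-path 301 of the kind described above is usually set at the web-server level. A minimal sketch for Apache with mod_rewrite, using placeholder domains rather than the actual sites:

```apache
# Hypothetical .htaccess on the OLD site: permanently redirect every URL
# to the same path on the new domain (placeholder domain names).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-site.com/$1 [R=301,L]
```

Redirecting each old URL to its matching new URL (rather than everything to the homepage) preserves the most link equity, and the directives should stay in place for as long as the old domain is still registered.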
-
Hi
What is not clear is what is happening to the original site. Is the original site staying up? If so, the advice differs from that for a shut-down site simply transferring its content.
Can you clarify?
-
By all means use the copy, especially if you're ranking well for it. I'd recommend:
- Creating 301 redirects from each old URL to its new equivalent
- Updating the old site's robots.txt file after go-live with a Disallow: / directive
- If you still have access to the old site's property in Google Search Console (GSC), repointing the whole domain over, old to new
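For what the robots.txt step above looks like in practice, this is the full-site disallow being described (a sketch, placed on the old domain):

```
# robots.txt on the old domain, after go-live
User-agent: *
Disallow: /
```

One caution worth noting: a full disallow also stops Google from crawling the old URLs at all, so many migration guides recommend leaving the old site crawlable until the 301s have been discovered and processed.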
Related Questions
-
Is there an advantage to using rel=canonical rather than noindex on pages on my mobile site (m.company.com)?
Is there an advantage to using link rel=alternate (as recommended by Google) rather than noindex on pages on my mobile site (m.company.com)? The content on the mobile pages is very similar to the content on the desktop site. I see Google recommends canonical and alternate tags, but what are the benefits of using those rather than noindex?
Intermediate & Advanced SEO | jennifer.new
-
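For context on the question above, Google's documented pattern for separate mobile URLs pairs the two annotations rather than using noindex; a sketch with placeholder URLs:

```html
<!-- On the desktop page (www.company.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.company.com/page">

<!-- On the mobile page (m.company.com/page) -->
<link rel="canonical" href="https://www.company.com/page">
```

The benefit over noindex is that the mobile pages still consolidate their signals into the desktop URLs instead of being dropped from the index entirely.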
Site Migration of 4 sites into 1?
Hi Guys, I have a massive project involving a migration of 4 sites into 1. The 4 sites are: www.MainSite.com, www.E-commerce.com, www.Membership.com, www.ResearchStudy.com. The goal of this project is to have sites 1-4 regrouped into the Main Site. I will be following the best practice from this post https://moz.com/blog/web-site-migration-guide-tips-for-seos which has an awesome checklist. I am actually about to start Phase 3: URL redirect mapping. Because all of these sites have hundreds of duplicates, I figured I should first resolve the Main Site dup issues before creating the URL redirect mapping, but what about the other domains (2, 3, 4)? Should I first resolve the dup issues on those ones as well, or is it not necessary since they will be pointing at the Main Site's new domain? I want to make sure I don't overwork the programming team and myself. Thanks for sharing your expertise and any tips on how I should move forward with this.
Intermediate & Advanced SEO | Ideas-Money-Art
-
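The URL redirect mapping phase mentioned above is often easiest to manage as a simple old-path-to-new-URL table per retiring domain, from which server directives can be generated. A small sketch (the domains and paths are placeholders, not the actual project URLs):

```python
# Hypothetical sketch: turn a redirect map for one retiring site into
# Apache mod_alias "Redirect 301" directives for its .htaccess file.

def build_redirect_lines(url_map):
    """Turn {old_path: new_url} into Apache 'Redirect 301' directives."""
    lines = []
    for old_path, new_url in sorted(url_map.items()):
        lines.append(f"Redirect 301 {old_path} {new_url}")
    return lines

# Example mapping from one of the retiring sites to the main domain.
url_map = {
    "/shop/widgets": "https://www.mainsite.com/shop/widgets",
    "/research/2023-study": "https://www.mainsite.com/research/2023-study",
}

for line in build_redirect_lines(url_map):
    print(line)
```

Keeping the mapping in one place per domain also makes it easy to deduplicate old URLs before handing the list to the programming team.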
Penguin recovery, no manual action. Are our EMD sites killing our brand site?
Hi guys, Our brand site (http://urban3d.net) has been seeing a steady decline due to algorithm updates for the past two years. Our previous SEO company engaged in some black-hat link building, which has hurt us very badly. We have recently re-launched the site with better design and better content, and completed a disavow of hundreds of bad links. The site is technically indexed, but is still nowhere in the SERPs after months of work to recover it by our internal marketing team. The last SEO company also told us to build EMD sites for our core services, which we did: http://3dvisualisation.co.uk/ http://propertybrochure.com/ http://kitchencgi.com/ My question is: could these EMD sites now be hurting us even further and stopping our main brand site from ranking? Our plan is to rescue our brand site, with a view to retiring these outlier sites. However, with no progress on the brand site, we can't afford to remove these sites (which are ranking). It seems a bit chicken and egg. Any advice would be very much appreciated. Aidan, Urban 3D
Intermediate & Advanced SEO | aidancass
-
What to do about old urls that don't logically 301 redirect to current site?
Mozzers, I have changed my site URL structure several times. As a result, I now have a lot of old URLs that don't really logically redirect to anything in the current site. I started out 404-ing them, but it seemed like Google was penalizing my crawl rate AND it wasn't removing them from the index after crawling them several times. There are way too many (>100k) to use the URL removal tool, even at a directory level. So instead I took some advice and changed them to 200, but with a "noindex" meta tag, and set them to not render any content. I get fewer errors, but I now have a lot of pages that do this. Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 with noindex, or (c) are there other things I can do? 410 maybe? Thanks!
Intermediate & Advanced SEO | jcgoodrich
-
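If the 410 route mentioned above is chosen, Apache's mod_alias can return "410 Gone" for retired paths without needing a target URL; a sketch with hypothetical path patterns:

```apache
# Hypothetical .htaccess: mark retired URL patterns as permanently gone.
# "Redirect gone" and "RedirectMatch gone" send a 410 status, which
# signals the content was removed deliberately (vs. a 404's ambiguity).
Redirect gone /old-structure/
RedirectMatch gone ^/legacy-(.*)$
```

A 410 is generally treated as a stronger removal signal than a 404, though either will eventually drop the URLs from the index.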
If your site has used www for the last few years is it a good idea to direct to non www?
I had a website moved to WP Engine lately, and when the developer set it up he used http://example.com/. For the last few years the site has been directing to http://www.example.com/. Should I redirect the site to http://www.example.com/ to avoid losing indexed pages and Google rankings?
Intermediate & Advanced SEO | webestate
-
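If consolidating back onto the www hostname, a common mod_rewrite sketch looks like this (placeholder domain, assuming Apache):

```apache
# Hypothetical .htaccess: 301 any non-www request to the www hostname,
# preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Since the www version is the one with years of indexing history, redirecting back to it avoids splitting signals between two hostnames.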
Adding a huge new product range to eCommerce site and worried about Duplicate Content
Hey all, We currently run a large eCommerce site that has around 5000 pages of content and ranks quite strongly for a lot of key search terms. We have just recently finalised a business agreement to incorporate a new product line that complements our existing catalogue, but I am concerned about dumping this huge amount of content (which is sourced via an API) onto our site and the effect it might have dragging us down for our existing type of product. In regards to the best way to handle it, we are looking at a few ideas and wondered what SEOMoz thought was the best. Some approaches we are tossing around include: making each page point to the original API the data comes from as the canonical source (not ideal, as I don't want to pass link juice from our site to theirs); adding "noindex" to all the new pages so Google simply ignores them, and hoping we get side sales onto our existing products instead of trying to rank, as the new range is highly competitive (again not ideal, as we would like to get whatever organic traffic we can); manually rewriting each and every new product page's descriptions, tags etc. (a huge undertaking in terms of working hours, given it will be around 4,400 new items added to our catalogue). Currently the industry standard seems to be to just pull the text from the API and leave it, but doing exact text searches shows that there are literally hundreds of other sites using the exact same duplicate content... I would like to persuade higher management to invest the time into rewriting each individual page, but it would be a huge task and difficult to maintain as changes continually happen. Sorry for the wordy post, but this is a big decision that potentially has drastic effects on our business, as the vast majority of it is conducted online. Thanks in advance for any helpful replies!
Intermediate & Advanced SEO | ExperienceOz
-
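For reference, the first two options weighed above correspond to these HTML annotations on each imported product page (URLs are placeholders):

```html
<!-- Option 1: cross-domain canonical pointing at the original data source -->
<link rel="canonical" href="https://supplier.example.com/product/123">

<!-- Option 2: keep the page out of the index but let its links be followed -->
<meta name="robots" content="noindex, follow">
```

Note that a cross-domain canonical is a hint, not a directive, so Google may not always honour it on pages that aren't near-identical to the target.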
Block all but one URL in a directory using robots.txt?
Is it possible to block all but one URL with robots.txt? For example, take domain.com/subfolder/example.html: if we block the /subfolder/ directory, we want all URLs except the exact-match URL domain.com/subfolder to be blocked.
Intermediate & Advanced SEO | nicole.healthline
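As it happens, robots.txt prefix matching gives the behaviour asked about directly: disallowing the directory with a trailing slash blocks everything under it, but not the bare /subfolder path, since that path does not begin with the disallowed prefix. A sketch:

```
User-agent: *
Disallow: /subfolder/
# /subfolder               -> allowed (does not start with "/subfolder/")
# /subfolder/example.html  -> blocked
```

If instead one specific file inside the directory should stay crawlable, an "Allow: /subfolder/example.html" line can be added alongside the Disallow, since most major crawlers honour the more specific rule.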