Redirecting external blog to main website blog - two questions I'm struggling with
-
Hiya Mozzers - I have a blog that is separate from the main website and duplicates the blog on the main website, so it needs to be closed down and redirected. The separate blog has around 200 pages (identified via Screaming Frog).
So I am suggesting that the pages on the separate blog should be 301 redirected to the equivalent blog pages on the main website.
Q1) Should the root domain of the separate blog be 301 redirected to the main www.mainwebsite.com/blog page, or to the root domain of the main website?
Q2) Should the blog pages that lack equivalent content on the main website be 301 redirected to the main www.mainwebsite.com/blog page, or to the root domain of the main website?
Thanks for your help on this one, Luke
-
Hi Luke,
As alrockn says, I would redirect the root domain of the duplicate blog to mainsite.com/blog.
For the pages that have no duplicate on the main site, either find the likeliest match or redirect them to mainsite.com/blog. By likeliest match, I mean that if any pages on the old blog seem a good fit for something on the main site, redirect them there. You may find that none of them fit any of the existing pages. If so, don't worry - I'd redirect those to the main site's blog.
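If it helps, here's a rough Python sketch of one way to build that mapping from your Screaming Frog export. The file name, column name, domains and slugs are all placeholders for illustration, so treat it as a starting point rather than a finished script:

```python
import csv
from urllib.parse import urlparse

# Hypothetical inputs - swap in your real export and domains.
OLD_BLOG_EXPORT = "screaming_frog_old_blog.csv"   # crawl export of the separate blog
MAIN_BLOG_ROOT = "https://www.mainwebsite.com/blog"

# Slugs that already exist on the main site's blog (gathered however you like -
# another crawl, the sitemap, etc.). These are made-up examples.
main_site_slugs = {"red-widgets-guide", "blue-widgets-review"}

def slug(url: str) -> str:
    """Return the last path segment, e.g. '/blog/red-widgets-guide/' -> 'red-widgets-guide'."""
    path = urlparse(url).path.strip("/")
    return path.split("/")[-1] if path else ""

redirect_map = {}
with open(OLD_BLOG_EXPORT, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        old_url = row["Address"]  # assumes the export's URL column is named "Address"
        s = slug(old_url)
        if s in main_site_slugs:
            # Equivalent post exists on the main site: redirect straight to it.
            redirect_map[old_url] = f"{MAIN_BLOG_ROOT}/{s}/"
        else:
            # No equivalent content: fall back to the main site's blog index.
            redirect_map[old_url] = MAIN_BLOG_ROOT

for old, new in redirect_map.items():
    print(old, "->", new)
```

It won't catch posts where the slugs differ but the content is close, so it's still worth eyeballing the output before setting up the redirects.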
Cheers,
Jane
-
Thanks David - I was wondering whether the blog's root domain should go to the main website's blog page (www.mainwebsite.com/blog) or to the homepage (www.mainwebsite.com/). I'm not sure it matters one way or the other, as I've never had to deal with this kind of issue before.
-
Option 1 is a very good idea if you think these links will hurt you in Google's eyes. Since you say these are all duplicate content, this is probably what you want to do, but first be sure you don't want the link equity or authority from the old blog. If the old blog gets more traffic, has more authority, etc., the decision gets tougher to make.
-
You have 2 options:
1. You can turn off the duplicated site and submit a URL removal request in Webmaster Tools. The domain you no longer want - the one causing the duplication - must return a 404 error for Google to accept the request. This will cause the entire old site to go away and its URLs to be removed from Google's index. If the old site does not get much traffic or many visits, this would be the best route.
2. Do 301 redirects on the duplicated site. You should try to match them up, such as:
olddomain.com/your-blog-post
301 redirects to
NEWdomain.com/your-blog-post
If they do not match up exactly, use your best judgement and redirect them either to the home page or to a page on a similar subject. For example, if you had an SEO-related post that wasn't an exact match, you could forward it to another article that referenced SEO. If you have to send a few to the home page, that won't hurt anything - just don't make the total number excessive, as that could potentially look spammy (there's a quick way to sanity-check the redirects sketched below). Best of luck - if you need any help, let me know!
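Here's a rough Python sketch of that sanity check - the domains and paths are made up, and it assumes you have the requests library installed. It simply asks each old URL what it returns and whether the 301 points where you expect (or whether it 404s, if you went the removal route):

```python
import requests  # third-party library: pip install requests

# Hypothetical mapping: old URL -> expected destination,
# or None if the page should simply return a 404 after removal.
expected = {
    "http://olddomain.com/your-blog-post": "https://NEWdomain.com/your-blog-post",
    "http://olddomain.com/retired-post": None,
}

for old_url, target in expected.items():
    # Don't follow redirects - we want the first response the old URL gives.
    resp = requests.get(old_url, allow_redirects=False, timeout=10)

    if target is None:
        ok = resp.status_code == 404
        print(f"{old_url}: {resp.status_code} (expected 404) {'OK' if ok else 'CHECK'}")
    else:
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location.rstrip("/") == target.rstrip("/")
        print(f"{old_url}: {resp.status_code} -> {location or 'no Location header'} "
              f"{'OK' if ok else 'CHECK'}")
```

Run it after the redirects go live, and again a week or two later, to make sure nothing has quietly been switched back to a 302.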
-
You should try to mirror the 301s to the duplicates on the main site. So, in answer to your first question, the duplicate blog's index page should go to main/blog, and everything else should go to its new location on the main site. Anything without an equivalent landing page on the main site should redirect to main/blog.
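If the old blog sits on Apache, one way to turn that mapping into rules is to generate simple mod_alias Redirect lines for its .htaccess file. This is just a sketch with made-up URLs - the domains, paths and hosting setup are assumptions, so adapt it to whatever server the old blog actually runs on:

```python
from urllib.parse import urlparse

# Hypothetical mapping of old blog URLs to their new homes on the main site.
redirect_map = {
    "http://blog.olddomain.com/red-widgets-guide/": "https://www.mainwebsite.com/blog/red-widgets-guide/",
    "http://blog.olddomain.com/random-old-post/": "https://www.mainwebsite.com/blog",
    "http://blog.olddomain.com/": "https://www.mainwebsite.com/blog",
}

# mod_alias matches Redirect rules by path prefix and the first match wins,
# so emit the longer (more specific) paths before the catch-all "/" rule.
ordered = sorted(redirect_map.items(),
                 key=lambda item: len(urlparse(item[0]).path),
                 reverse=True)

for old_url, new_url in ordered:
    old_path = urlparse(old_url).path or "/"
    print(f"Redirect 301 {old_path} {new_url}")
```

The printed lines can be reviewed and pasted into the old blog's .htaccess, or translated into the equivalent nginx rewrites if that's what it runs on.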
Related Questions
-
New blog on a separate server to the main website?
We have a potential client who operates a jobs board in a niche sector in the UK. They want to start a blog but don't want to set it up on the same server as the main jobs site. Discussion started around WordPress, and their preference is for the WP.com hosted version in a directory or subdomain of the TLD. Our concerns are around the different locations of the two sites (impact of two different server locations and IP addresses?) but also the limitation of WP.com in interlinking the two sites enough to provide a decent customer experience. Thoughts, musings, advice - all welcome! Thanks
Intermediate & Advanced SEO | AB-Marketing
-
Would You Redirect a Page if the Parent Page was Redirected?
Hi everyone! Let's use this as an example URL: https://www.example.com/marvel/avengers/hulk/ We have done a 301 redirect for the "Avengers" page to another page on the site. Sibling pages of the "Hulk" page live off "marvel" now (e.g. /marvel/thor/ and /marvel/iron-man/). Is there any benefit in doing a 301 for the "Hulk" page so it lives at /marvel/hulk/ like its sibling pages? Is there any long-term harm in leaving the "Hulk" page under a permanently redirected page? Thank you! Matt
Intermediate & Advanced SEO | amag
-
Redirect closed shop to main shop, or keep the domain and content alive and use it for link building?
Hello, We used to have two shops selling our products: a small shop with a small selection of only our best quality products (domain smallshop.com), and a big shop with everything (bigshop.com). That setup used to make sense (without going into full detail), but it's not relevant anymore, so we decided to stop maintaining the small shop because it was time consuming and not worth it. There are some really good links pointing to smallshop.com, and the content is original (the product descriptions are different between the two shops). So far, we have just switched the "add to cart" button on the small shop into a link to the same product on the big shop, and added links from the small shop's category pages to the big shop as well. So the question is: in your opinion, is it better to do that - keep the small shop and its content alive and use it to build links to our big shop - or to do 301 redirects and shut the small shop down completely? Thanks for your opinion!
Intermediate & Advanced SEO | Colage
-
Going from 302 redirect to 301 redirect weeks after changing URL structure
I made a small change on an ecommerce site that had big impacts I didn't consider... About six weeks ago, in an effort to clean up one of many SEO-related problems on the site, I had a developer rewrite the URLs to replace underscores with hyphens and redirect every page on the site to its new URL. We didn't immediately update our sitemap to reflect the changes (bad!) and I just discovered all the redirects are 302s... Since these changes, most of the pages have a page authority of 1 and we have dropped several spots in organic search. If we were to set up 301 redirects for the pages whose URL structure we changed, would there be any change in organic search placement and page authority, or is it too late?
Intermediate & Advanced SEO | Nobody1611699043941
-
Has there been a 'Panda' update in the UK?
My site in the UK suddenly dropped from page 1 and out of the top 50 for all keywords using 'recliner' or a derivative. We are a recliner manufacturer and have gained rank over 15 years, of course using only white hat tactics. Did Google make an algo update in the UK last week?
Intermediate & Advanced SEO | KnutDSvendsen
-
.htaccess Redirect Question
Quick question on .htaccess redirects... I have a site http://www.securitysystemsfortlauderdale.org/ADT-Home-Security-Alarm-Systems/ and I am running it through SEOmoz for backlinks, and I noticed a large discrepancy between the links on the root and the links on the redirect. There are more links on the root and fewer on the redirect. Does this affect SEO for Google, or does Google follow the redirects and give credit accordingly? Thanks for your help!!! Matt
Intermediate & Advanced SEO | joeups
-
Is 404'ing a page enough to remove it from Google's index?
We set some pages to 404 status about 7 months ago, but they are still showing in Google's index (as 404's). Is there anything else I need to do to remove these?
Intermediate & Advanced SEO | nicole.healthline
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us...
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus