Blog separate from Website
-
One of my clients has a well-established website and a well-established blog, each with its own domain. Is there any way to move the blog to his website's domain without losing the SEO and links that he has built up over time?
-
I am with Erwan. You have essentially two authority sites, two authority domains that link to each other and support each other. That is golden, and you want to keep taking advantage of it.
Sure, you can move the blog and use 301s, and it would "work," but I bet you would lose some ranking.
-
Leave as is: the blog website can link to other websites in your network and improve their domain authority.
Leave as is: by moving, you will lose blog comments / social shares.
-
Thanks for the responses and suggestions. Both responses have been for consolidation. Is there an argument that it would be better to leave it as is, or should I definitely be thinking about moving it over?
-
Use a 301 redirect from each old page to its new page, for every page; see Google's "moving your site" page: https://support.google.com/webmasters/answer/83105?hl=en
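For the per-page 301 approach, here is a minimal verification sketch in Python (using the requests library). The domains, paths, and redirect map below are hypothetical placeholders, not the client's actual URLs; in practice the map would be generated from the full old-to-new URL list.

```python
# Minimal sketch: confirm each old blog URL returns a 301 to its new location.
# All URLs below are hypothetical examples.
import requests

REDIRECT_MAP = {
    "https://blog.example.com/": "https://www.example.com/blog/",
    "https://blog.example.com/some-post/": "https://www.example.com/blog/some-post/",
    "https://blog.example.com/category/news/": "https://www.example.com/blog/category/news/",
}

def check_redirects(redirect_map):
    """Request each old URL without following redirects and confirm it
    returns a 301 pointing at the expected new URL."""
    for old_url, expected_new_url in redirect_map.items():
        response = requests.get(old_url, allow_redirects=False, timeout=10)
        status = response.status_code
        location = response.headers.get("Location", "")
        if status == 301 and location == expected_new_url:
            print(f"OK     {old_url} -> {location}")
        else:
            print(f"CHECK  {old_url}: status={status}, location={location!r}")

if __name__ == "__main__":
    check_redirects(REDIRECT_MAP)
```

Requesting each old URL with redirects disabled makes it possible to confirm both the 301 status code and the exact Location target, rather than just the final landing page.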
-
Brent,
What I would suggest is moving the blog from the blog site to the main domain. I would set it up so that people are automatically redirected to the main website when they hit the blog (I think both WordPress and Blogger will do this for around $20 per year). I would import the content from the old blog and update the tags to make sure they are consistent and that you do not have any duplicate content on the main website (see the sketch below for one way to check). This way you do not lose your old following, and you start building inbound links and a following on your main website.
Hope this helps,
Ron
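To illustrate the duplicate-content check mentioned above, here is a rough Python sketch. The slugs and titles are made-up placeholders; in practice the two lists would come from the old blog's export and a crawl (or CMS listing) of the main website.

```python
# Rough sketch (hypothetical data): flag slugs from the imported blog posts
# that collide with pages already on the main website, before going live.
from collections import defaultdict

# Placeholder data; real lists would come from the blog export and a site crawl.
imported_posts = [
    ("how-to-choose-a-widget", "How to Choose a Widget"),
    ("widget-maintenance-tips", "Widget Maintenance Tips"),
]
existing_pages = [
    ("widget-maintenance-tips", "Widget Maintenance Tips"),
    ("about-us", "About Us"),
]

def find_duplicates(imported, existing):
    """Index existing page slugs, then report imported slugs that collide."""
    seen = defaultdict(list)
    for slug, title in existing:
        seen[slug.lower()].append(title)
    duplicates = []
    for slug, title in imported:
        if slug.lower() in seen:
            duplicates.append((slug, title, seen[slug.lower()]))
    return duplicates

for slug, title, matches in find_duplicates(imported_posts, existing_pages):
    print(f"Possible duplicate content: imported post '{title}' ({slug}) "
          f"collides with existing page(s): {matches}")
```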
Related Questions
-
Website blog is hacked. What's the best practice to remove bad URLs?
Hello. So our site was hacked, which created a few thousand spam URLs on our domain. We fixed the issue, and all the spam URLs now return 404. The Google index shows a couple of thousand bad URLs. My question is: what's the fastest way to remove the URLs from the Google index? I created a sitemap with the bad URLs and submitted it to Google. I am hoping Google will crawl them, since they are in the sitemap, and remove them from the index, as they return 404. Are there any tools to get a full list of the Google index? (Search Console downloads are limited to 1,000 URLs.) A Moz site crawl gives a larger list, which includes URLs not in the Google index too. I'm looking for a tool that can download results from a site: search. Is there any way to remove the URLs from the index in bulk? Removing them one by one will take forever. Any help or insight would be very appreciated.
Technical SEO | ajiabs1
-
Page Authority for localized version of website
Hello everyone, I have a case here where I need to decide which steps to take to improve page authority (and thus SEO value) for the German pages on our site. We localized the English version into German at the beginning of 2015: www.memoq.com - English; de.memoq.com - German. By October 2015 we had implemented hreflang tags so that Google would index the pages according to their language. That implementation has been successful. There is one issue though: at that time, all our localized pages had only "1" point for Page Authority ("PA" in the Moz bar). At the beginning we thought that this could be due to the fact that localization was done using subdomains (de.memoq.com) rather than subfolders (www.memoq.com/de). However, we decided not to implement changes and to let Google assess the work we had done with the hreflang tags. It's been a while now, and still all our German pages have only "1" point for Page Authority. Plus, we have keywords for which we rank in the top 10 in English (US Google search), but this is not the case for the translated versions of the keywords in German (Germany Google search). So my question basically is: is this lack of page authority and SEO value rooted in the fact that we used subdomains instead of subfolders for the URL structure? If so, is it likely that Page Authority for the German pages and SEO value will increase if I change the structure from subdomains to subfolders? Or is the problem in PA rooted somewhere else that I am missing? I appreciate your feedback.
Technical SEO | Kilgray0
-
Infinite scroll blog
We're currently redesigning a website for our client, which includes an infinite-scroll blog. When driving traffic to individual blog posts, is there a particular way to provide them with a unique URL to view the desired blog post? Thanks.
Technical SEO | SymbiontGroup0
-
Server 500: website deindexed?
Hi mozzers, Since August 22nd, a site (not a site I manage) has had a server error 500 and all of its pages got deindexed. This is obviously a server issue, but why did it get deindexed? Is it because it has been a while since it first had this server issue? The pages I checked load correctly, so I am a bit confused here! His webmaster account shows 1,500 server errors! Can someone tell me what is going on and how to fix it? Thanks
Technical SEO | Ideas-Money-Art0
-
Creating a Blog of Rodent Removal Companies?
I am helping a small company. Let's say rodent removal is their service. But local SEO for rodent removal is very, very competitive in my town and across America. Would a website/blog dedicated to highlighting rodent removers across America be good for my company? We have had nice success with wordpress.com blogs. Suppose I gave six other rodent removal companies a free guest post (always 300 words or more) to publish on my blog. Of course, none of these companies would be in my market. Would that help my local SEO? I am thinking long term here.
Technical SEO | greenhornet770
-
WordPress Blog Blocked by Meta Robots
Upon receiving my first crawl report from my new SEOmoz Pro account (yay!), I've found that the WordPress blog plugged into my site hasn't been getting crawled because it is blocked by meta robots. I'm not a developer and have very little tech expertise, but a search dug up that the issue stemmed from the WordPress Settings > Privacy > "Ask search engines not to index this site" option being selected. On checking the blog, "Allow search engines to index this site" was selected, so I'm unsure what else to check. My level of expertise means I'm not confident going into the back end of the site, and I don't have a tech guy on site to speak to. Has anyone else had this problem? Is it common, and will I need to consult a developer to get this fixed? Many thanks in advance for your help!
Technical SEO | paj19790
-
Rebuilding an old website
We have a strong website, meaning high traffic, but we have two issues: 1. the framework of the design is not user friendly; 2. the current platform is really old, so it runs into technical problems daily. We are worried about how our links will be affected by the new design. What would be wise to do? Thanks
Technical SEO | apexcue0
-
Duplicate Content within Website - problem?
Hello everyone, I am currently working on a big site which sells thousands of widgets. However, each widget has ten sub-widgets (1, 2, 3..., say). My strategy with this site is to target the long-tail search, so I'm creating static pages for each possible variation. So I'll have a main product page on widgets in general, and also a page on widget 1, a page on widget 2, etc. I'm anticipating that because there's so much competition for searches relating to widgets in general, I'll get most of my traffic from people being more specific and searching for widget 1 or widget 7, etc. Now here's the problem: I am getting a lot of content written for this website, a few hundred words for each widget. However, I can't go to the extreme of writing unique content for each sub-widget; that would mean tens of thousands of articles. So... what do I do with the content? Putting it on the main widget page was the plan, but what do I do about the sub-pages? I could put it there, and it would make perfect sense to a reader and be relevant to people specifically looking for widget 1, say, but could there be an issue with it being viewed as duplicate content? One idea was to just put a snippet (the first 100 words) on each sub-page with a link back to the main widget page where the full copy would be. Not sure whether I've made myself clear at all, but hopefully I have, or I can clarify. Thanks so much in advance, David
Technical SEO | OzDave0