Best Practice to Remove a Blog
-
Note: Re-posting since I accidentally marked it as answered
Hi,
I have a blog with thousands of URLs; the blog is part of my site.
I would like to retire the blog. I think the best choices are:
1. 404 them: the problem is a large number of 404s. I know this is OK, but it makes me hesitant.
2. Add a noindex/nofollow meta tag: this would be great, but the problem is the pages are already indexed.
Thoughts?
Thanks
PS
A 301 redirect to the main page would be flagged as a soft 404.
-
Subdomains are treated slightly differently by Google. Essentially they are seen as less connected to the rest of your content than a subfolder.
Take wordpress.com as an example:
- surferdude.wordpress.com has little relation to www.wordpress.com
- surferdude.wordpress.com has little relation to skaterguy.wordpress.com
- surferdude.wordpress.com has lots in common with surferdude.wordpress.com/surfboards/***
In the same regard, www.yourdomain.com/blog is more correlated with www.yourdomain.com than blog.yourdomain.com would be.
By using www.yourdomain.com/blog instead of a subdomain, you build more value for your www subdomain every time you post blog content or earn links to your blog. That value flows to the rest of the www content on your site.
-
I agree as well, thank you.
As far as subdomain vs. subfolder goes, I see no difference. Can you explain, Kane?
-
Agree with Kane. If you're going to be building the blog elsewhere, then just set up a 301 redirect to it.
-
In that case, it doesn't sound like any blog posts get frequent referral traffic. If so, everything should get a broad 301 redirect to the new blog page. Depending on your URL structure, this can typically be done with one redirect rule, so you don't have to redirect each and every URL.
On the topic of subdomains, subfolders are typically a better choice for SEO purposes.
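The broad, pattern-based 301 described above can be sketched as follows. The paths and the target URL here are hypothetical stand-ins; a real setup would express the same rule in the web server's redirect config rather than application code.

```python
import re

# Hypothetical URL structure: every old post lives under /blog/.
# One pattern-based rule sends them all to the new blog home,
# so no per-URL redirect list is needed.
OLD_BLOG = re.compile(r"^/blog/.*$")

def redirect_target(path):
    """Return the 301 target for an old blog path, or None if no redirect applies."""
    if OLD_BLOG.match(path):
        return "https://www.yourdomain.com/new-blog/"
    return None
```

The same one-rule idea applies whether the destination is a subfolder or a subdomain; only the target URL changes.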
-
The blog has little value, with almost no user traffic.
It will be redesigned in a subdomain on the site.
I am only concerned with crawlers (Googlebot, etc.) and being penalized for tons of missing pages if I 404 them.
There is nothing linking to the blog.
-
A few other questions for you first:
- Why on earth are you getting rid of everything?
- Are you going to replace that content with new content - either now or eventually?
- Is there any other content on your site that is relevant to the articles?
A few broad answers that I can say without hesitation:
- No, absolutely do not leave a bunch of 404s. IMO, everything should 301 somewhere. Sending people to relevant content is best, but sending them all to the homepage or a landing page that says "sorry but we deleted our blog" is better than a 404.
- No, "noindex/nofollow" is not worthwhile. If you want to keep the content and deindex it, choose "noindex/follow." At least then you keep some of the value of the pages (they can continue spreading some of their value to other pages on the site).
-
Hiya,
Without knowing more about your site and the blog, here are some things I would consider.
I'm going to assume that you're trying to decide what to do with the blog while still retaining the maximum benefit for the overall SEO of your site.
You say that the blog has thousands of URLs. What you need to do is determine how many sites are linking to your blog content. (You can do this using Open Site Explorer, or look in Google Webmaster Tools or Google Analytics to see who is referring traffic.)
The first question I would ask is whether you need to remove the content at all. Would it be possible just to put a banner on top of the existing pages saying that the blog is no longer active?
How many search visitors does the blog get? If the blog postings are getting visitors, then you need to ask yourself if you're happy to give these up.
Would anyone else be interested in taking over the blog?
If you decide to remove your content:
Put 301 redirects in place to direct traffic to your main site. You'll preserve some of the value of your inbound links.
Do your blog pages relate to specific content on the main site that may be of interest to the visitor? If you can determine specific pages that are strongly related to the removed pages, then link to those.
I wouldn't just remove the pages and respond with a 404 error. You'll lose any value from the links to those pages.
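The "redirect each old page to its most relevant page, with the main site as a fallback" idea above can be sketched like this. All paths here are hypothetical examples; a real map would come from matching old posts to related pages on the main site.

```python
# Hypothetical mapping of old blog paths to strongly related pages.
RELATED = {
    "/blog/how-to-pick-widgets": "/products/widgets",
    "/blog/widget-care-guide": "/support/widget-care",
}

FALLBACK = "/"  # homepage, used when no related page exists

def redirect_for(old_path):
    """301 target: a strongly related page if one is known, else the homepage."""
    return RELATED.get(old_path, FALLBACK)
```

Mapping to related pages preserves more of the inbound-link value than sending everything to the homepage, which is the reasoning in the answer above.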
Hope this helps!
Related Questions
-
How to remove Parameters from Google Search Console?
Hi all, the following is a parameter configuration in Search Console. Parameter: fl
Does this parameter change page content seen by the user? - Yes: changes, reorders, or narrows page content.
How does this parameter affect page content? - Narrow
Which URLs with this parameter should Googlebot crawl? - Let Googlebot decide (default)
Query: it is actually a filter parameter. I have already set a canonical on the filter pages. Now I am tracking filter pages via the data layer and Tag Manager, so in Google Analytics I am not able to see filter URLs because of this parameter. So I want to delete this parameter. Can anyone please help me? Thanks!
Technical SEO | adamjack
-
Redirect Process for Moving a Blog
Hi, I've read several articles about the correct process for moving a blog from a subdomain to the main root domain, but am not quite 100% sure what to do in our scenario. They were hosting their blog on Hubspot, which puts the blog on a subdomain, "blog.rootdomain.com". Realizing it isn't benefiting the main website for SEO, they want to move it to the main website. I understand we have to redirect the Hubspot "blog." pages to the new "rootdomain.com/blog" pages, but when transferred over (it's a WordPress site) the URLs show the dates, i.e. "rootdomain.com/blog/year/month/title". They want to remove the date. Does that mean each URL must be rewritten and then redirected so that there's no date showing? There are over 300 posts that will have to be redirected from the Hubspot URLs. Is there a way to avoid setting up the second redirect to remove the dates, or to make it easier so it isn't one page at a time?
Technical SEO | Flock.Media
-
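On the question above: one pattern-based rule can strip the date from all 300 dated WordPress paths at once, instead of rewriting them one page at a time. A sketch of the rewrite logic, assuming the usual /blog/year/month/title structure from the question (real post slugs will differ):

```python
import re

# Hypothetical dated WordPress path: /blog/2016/05/my-post-title
# Desired dateless path:            /blog/my-post-title
DATED = re.compile(r"^/blog/\d{4}/\d{2}/(?P<slug>[^/]+)/?$")

def dateless(path):
    """Rewrite one dated blog path to its dateless form (None if not dated)."""
    m = DATED.match(path)
    return "/blog/" + m.group("slug") if m else None
```

In practice, changing the WordPress permalink setting to a dateless structure plus one server redirect rule of this shape covers every post; the old Hubspot URLs can then be pattern-redirected to the same dateless paths, so nothing has to be done per URL.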
Link in my blog to external pages - nofollow?
Hi everyone. I'm quite new to SEO (started 6 months ago, but I'm very lucky to work for a company that is willing to pay while I'm learning). I also create some content for the website - I write a wood flooring industry blog. Following advice from a few SEO experts around the world whom I follow, I have decided to write 1-2 off-topic blogs per month, but I have found a way of writing off topic while in reality bringing together the wood flooring industry and, let's say, the fashion industry. My question is: if I want to link to one or two pages (let's say big fashion brands), shall I use a nofollow link? I do not want to harm my website or theirs. Can those types of posts be part of building my position within my industry and, in general, building the authority of my website? Of course, apart from getting links TO my website. Tom
Technical SEO | ESB
-
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. A number of big websites do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX, then search engine spiders aren't going to be able to crawl our categories the way they can now. I'm wondering what the best way around this is. Some ideas that spring to mind are: detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version; or make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
Technical SEO | paul.younghusband
-
Best Practices for adding Dynamic URLs to an XML Sitemap
Hi guys, I'm working on an ecommerce website with all the product pages using dynamic URLs (we also have a few static pages, but there is no issue with them). The products are updated on the site every couple of hours (because we sell out or the special offer expires), and as a result I keep seeing heaps of 404 errors in Google Webmaster Tools and am trying to avoid this (if possible). I have already created an XML sitemap for the static pages and am now looking at incorporating the dynamic product pages, but am not sure what the best approach is. The URL structure for the products is as follows:
http://www.xyz.com/products/product1-is-really-cool
http://www.xyz.com/products/product2-is-even-cooler
http://www.xyz.com/products/product3-is-the-coolest
Here are 2 approaches I was considering:
1. Just include the dynamic product URLs within the same sitemap as the static URLs, using only http://www.xyz.com/products/ - this is so spiders have access to the folder the products are in and I don't have to create an automated sitemap for all products, OR
2. Create a separate automated sitemap that updates whenever a product is updated, and set the change frequency to hourly - this is so spiders always have as close to an up-to-date sitemap as possible when they crawl it.
I look forward to hearing your thoughts, opinions, suggestions and/or previous experiences with this. Thanks heaps, LW
Technical SEO | seekjobs
-
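Approach 2 above can be sketched as a small generator that rebuilds the products sitemap from the current product list whenever stock changes, so sold-out URLs simply drop out. The example URL is taken from the question; everything else is a generic sketch using the standard sitemaps.org format:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls, changefreq="hourly"):
    """Build a sitemaps.org-format XML sitemap string from a list of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for u in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = u
        ET.SubElement(url, "changefreq").text = changefreq
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap(["http://www.xyz.com/products/product1-is-really-cool"])
```

This leans toward approach 2: a sitemap entry is a full page URL, so listing only the /products/ folder (approach 1) wouldn't expose the individual product pages to spiders.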
Best practice for XML sitemap depth
We run an ecommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy, etc.), each catalogue having numerous ranges (Counting, Maths Games, etc.), then products within those. We carry approximately 15,000 products. My question is around the sitemap we submit - nightly - and its depth. It is currently set to cover home, catalogues, and ranges, plus all static content (about us, etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | TTS_Group
-
Best Implementation of a Title Tag
If my targeted keywords are: Mussoorie Hotels, Hotels in Mussoorie, Mussoorie Resorts, Resorts in Mussoorie - which of the 3 below will be the best title tag after Panda and Penguin?
1. Hotels and Resorts in Mussoorie
2. Mussoorie Hotels | Mussoorie Resorts | Luxury, Budget & Economical Accommodation in Mussoorie
3. Mussoorie Hotels, Mussoorie Resorts, Hotels in Mussoorie, Resorts in Mussoorie
Please suggest!
Technical SEO | WildHawk
-
What is the best website structure for SEO?
I've been on SEOmoz for about 1 month now, and everyone says that, depending on the type of business, you should build up your website structure for SEO as a 1st step. I have a new client, click here (the www version doesn't work)... some bugs we are fixing now. We are almost finished with the design & layout. Two questions have been running through my head:
1. What would the best URL category for the shop be? /products/ is the current URL category, e.g. /products/door-handles.html
2. What would you use for the main menu sections to get the most out of SEO? Personally I am thinking of making 2-3 main categories, each with a section where I can add content (3-4 paragraphs, images, maybe a video). This way the main page focuses on the domain name more, and the rest of the sections focus on specific keywords, which is how I avoid cannibalization. The main target keyword is "door handles". Any suggestions would be appreciated.
Technical SEO | mosaicpro