A Blog Structure Dilemma We're Facing...
-
We're launching a pretty large content program (in the form of a blog) and have a structure issue:
We're big fans of WordPress for efficiency reasons, but our platform doesn't allow hosting a WordPress (or other third-party) blog on the primary domain where we want it: site.com/blog
Here are the options:
1. Sub-domain: We can easily put it there. The benefit is that we get the efficient WordPress tools and it's very fast to set up. The downside is that the root domain won't get the benefit of any backlinks to the blog (as far as I understand), and I also don't believe the primary domain will benefit from the daily fresh, unique content the blog offers.
2. Custom Rig: We could create our own manual system of pages on the site to look just like our blog would. This would allow us to have it at site.com/blog and benefit from any backlinks and fresh content. The downside is that it won't be as efficient to manage.
3. External Site: Create a different site just for the blog. Same issue as the sub-domain I believe.
User experience is a top priority, and all of the above can pretty much accomplish the same UX goal, with #3 requiring some additional strategy on positioning.
Is #1 or #3 going to be a big regret down the road though, and is the backlink/content benefit clearly worth doing #2?
(correct me if I'm wrong on my assumptions with #1 but at least with the backlinks I'm almost certain that's the case)
Many thanks for your inputs on this.
-
Matt Cutts
Subdomains vs. Subdirectories What's the difference between using subdomains and subdirectories? When it comes to Google, there aren't major differences between the two, so when you're making that decision, do what works for you and your visitors. http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Deb, it really is a pretty personal choice. For something small like a blog, it probably won’t matter terribly much. I used a subdirectory because it’s easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service. http://www.mattcutts.com/blog/subdomains-and-subdirectories/
-
I also noticed that the sitelinks often include links from subdomains.
And Matt Cutts has said it's a personal choice, and GWMB states it makes no difference to them.
I have had good results so far with subdomains. I remember asking you for advice a year or two ago; you recommended good linking between sub and root domains to show the connection.
I have followed that advice, and the sitelinks for my sites in Google reflect the subdomains as subcategories of the root.
So I'm convinced subdomains act like subfolders, at least they have so far for me.
-
Great idea -- and the link Scot posted is perfect. However, our platform doesn't give us access to mod_proxy or .htaccess, so we are unable to set up the reverse proxy, unfortunately. Sigh.
-
Agreed - Google is consolidating subdomain links in Google Webmaster Tools, but as far as I know, that does not reflect a change in how the algorithm works. Subdomains can still fragment and split link-juice. The change is more of an accounting trick, for lack of a better word.
-
Thanks, Hugh! I'm in the same boat as SEOPA with 3dcart and this seems like the best solution.
This post by Slingshot SEO seems relevant (What is a Reverse Proxy and How Can it Help My SEO?).
-
Hm. Right, I think I have another suggested solution of sorts - it's tricksy and you'd need an expert to set it up, but it'd solve your problems.
In short, if you run a reverse proxy serving your site itself on a server which ISN'T your BigCommerce server, you can tell it to fetch your main site for your www.yourdomain.com URL, and your blog (live, not cached) for www.yourdomain.com/blog. Probably your best option would be to use a reverse proxy like Varnish or Nginx, both of which are normally used for performance reasons - however, they can also be used to effectively "combine" two servers into one.
So, you'd move your DNS record to point to the reverse proxy, then set the proxy up to fetch content from your ecommerce site and your blog site.
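A minimal sketch of what that reverse-proxy setup might look like in Nginx (all hostnames here are hypothetical placeholders, and a real deployment would need an expert to tune headers, caching, and SSL):

```nginx
# Hypothetical Nginx reverse-proxy config: one public hostname,
# two origin servers (the ecommerce platform plus a separately
# hosted WordPress blog), stitched together under one domain.
server {
    listen 80;
    server_name www.yourdomain.com;

    # Requests under /blog are fetched live from the blog server...
    location /blog/ {
        proxy_pass http://blog-origin.example.com/;
        proxy_set_header Host www.yourdomain.com;
        proxy_set_header X-Forwarded-For $remote_addr;
    }

    # ...and everything else comes from the ecommerce origin.
    location / {
        proxy_pass http://store-origin.example.com/;
        proxy_set_header Host www.yourdomain.com;
        proxy_set_header X-Forwarded-For $remote_addr;
    }
}
```

With the trailing slashes as written, a request for www.yourdomain.com/blog/post-name is fetched from /post-name on the blog origin; the blog's own URL settings would need to match the public /blog path.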
Issues:
-
You'd need another server, and you'd need root access and an expert sysadmin to set it all up.
-
I don't know how well BigCommerce would handle a reverse proxy - but frankly, they SHOULD be able to handle it OK if you talk to their sysadmins.
Advantages:
- This would also give you massive redundancy in case of high traffic - reverse proxy setups are usually used to improve performance. You'd be Digg-proof!
It's complex, but I can see it working! Just another suggestion.
More info on reverse proxies - http://en.wikipedia.org/wiki/Reverse_proxy
-
-
Hm. It seems to me that you've just got a routing issue - there MUST be a way to fix this.
Can you run a mod_rewrite .htaccess or similar on the server?
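For reference, a proxying rewrite rule of the sort being asked about might look like the fragment below (the hostname is hypothetical, and it only works if the host enables both mod_rewrite and mod_proxy):

```apache
# Hypothetical .htaccess fragment: serve /blog from an external
# WordPress host while keeping the public URL on the main domain.
# The [P] flag requires mod_proxy to be enabled by the host.
RewriteEngine On
RewriteRule ^blog/(.*)$ http://blog-origin.example.com/$1 [P,L]
```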
-
It's a platform issue. BigCommerce. Everything else has been fantastic with them, but our only option for WordPress is to host on a subdomain.
The clear answer is that having it in a directory is better, but doing so means we'd need a very manual setup and would lose the efficiencies/functionality of WordPress.
-
No opinion here.
In late 2010 we redirected two popular subdomains to folders in the root. The results have been kickass. Kickass.
-
Here is a post from earlier in the year with a similar discussion (I didn't see that one before I posted this). It also shows similar differences of opinion, though with more sources cited. http://www.seomoz.org/q/corporate-blog
Because of the lack of consensus, I'm curious to research more. Just want to make sure I/we didn't miss anything over the past few months.
-
The problem with this idea, it occurs to me on second thoughts, will be comments. Having dynamically user-generated content will be tricky with this workaround.
Aside from that, rsync and W3TC are both enterprise-level stable solutions, so it SHOULD work - but I agree, it's doing something new, and new's always a bit risky.
Would you be able to go into any detail as to why you can't host WP? Is it a hosting company issue, a platform language issue, or something else?
-
James: do you have a source for the statement that Google now treats subs as a key site element?
-
Interesting. I need to research this more. It sounds like it's prone to errors, but maybe not.
-
If I could not have the blog that is going to receive massive work in a subfolder I would be looking for a different platform for the site or a different method of creating the blog.
Placing that blog on a subdomain or on a satellite site is like tossing away great content, IMO.
-
Google now treats sub domains as a key element of the site
[citation needed]
Though I know what you're talking about - http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
However, as far as I'm aware, there's no information yet as to how Google are changing the weighting of these links (or even if they are), so I'd still be wary of charging ahead with a subdomain.
-
Can you rsync or otherwise automatically copy content onto your primary web server? If so, there may be a way to combine the best of all worlds.
Set up your WordPress platform somewhere else - doesn't matter where. Make sure Google isn't crawling it, to avoid duplicate content penalties. Install a caching solution like W3 Total Cache, which writes the entire blog to disk as static HTML.
Now, have a frequently-updating automatic synchronisation tool copy those files from the location on your blog server to the local directory on your web server corresponding to yourdomain.com/blog . Set up the same rewrite rules on your main server as W3TC uses on your blog server.
You should now have an automatically-updated static copy of your blog hosted under yourdomain.com/blog . As a bonus, it'll be fast as hell and stable as a large room full of horses.
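The synchronisation step could be as simple as a cron entry on the main web server (paths and hostname below are hypothetical; the directory W3TC writes its static cache to depends on your install):

```cron
# Hypothetical cron entry: every 5 minutes, mirror the static HTML
# that W3 Total Cache wrote on the blog server into the main server's
# /blog directory. --delete keeps removed posts from lingering.
*/5 * * * * rsync -az --delete wwwuser@blog-host:/var/www/blog/wp-content/cache/page_enhanced/ /var/www/html/blog/
```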
The actual setup's a bit of a faff, but my (non-pro SEO) intuition is that it'll be the best solution SEO-wise.
-
Thanks for the input, James. Agreed on the external site. I didn't know about subs being treated as a key element now. So other sites linking to posts on the blog (if the blog is on blog.site.com) will still benefit the primary domain?
Having it in a folder is doable, but more difficult to manage ongoing. I think it's a question of 'how much better' is it to have at site.com/blog...
-
If you cannot get it into a sub folder, i.e. site.com/blog, then the next best thing is to have it on a sub domain, blog.site.com.
Google now treats sub domains as a key element of the site, yet sub folders work better for internal linking.
I would not put it on an external site.