A Blog Structure Dilemma We're Facing...
-
We're launching a pretty large content program (in the form of a blog) and have a structure issue:
We're big fans of WordPress for efficiency reasons, but our platform doesn't allow hosting a WordPress (or other third-party) blog on the primary domain where we want it: site.com/blog.
Here are the options:
1. Subdomain: We can easily put it there. The benefit is that we keep the efficient WordPress tools and setup is very fast. The downside is that the root domain won't get the benefit of any backlinks to the blog (as far as I understand). I also don't believe the primary domain will benefit from the daily fresh/unique content the blog offers.
2. Custom Rig: We could create our own manual system of pages on the site to look just like our blog would. This would allow us to have it at site.com/blog and benefit from any backlinks and fresh content. The downside is that it won't be as efficient to manage.
3. External Site: Create a different site just for the blog. Same issue as the sub-domain I believe.
User experience is a top priority, and all of the above can pretty much accomplish the same UX goal, with #3 requiring some additional strategy on positioning.
Is #1 or #3 going to be a big regret down the road, though, and is the backlink/content benefit clearly worth doing #2?
(correct me if I'm wrong on my assumptions with #1 but at least with the backlinks I'm almost certain that's the case)
Many thanks for your inputs on this.
-
Matt Cutts
Subdomains vs. Subdirectories: What's the difference between using subdomains and subdirectories? When it comes to Google, there aren't major differences between the two, so when you're making that decision, do what works for you and your visitors. http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Deb, it really is a pretty personal choice. For something small like a blog, it probably won’t matter terribly much. I used a subdirectory because it’s easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service. http://www.mattcutts.com/blog/subdomains-and-subdirectories/
-
I also noticed that the sitelinks often include links from subdomains.
And Matt Cutts has said it's a personal choice, and GWMB states it makes no difference to them.
I have had good results so far with subdomains. I remember asking you for advice about a year or two ago, and you recommended good linking between sub and root domains to show the connection.
I have followed that advice, and the sitelinks for my sites in Google reflect the subdomains as subcategories of the root.
So I am convinced subdomains act like subfolders; at least they have so far for me.
-
Great idea -- and the link Scot posted is perfect. However, our platform doesn't give us access to mod_proxy or .htaccess, so we are unable to set up the reverse proxy, unfortunately. Sigh.
-
Agreed - Google is consolidating subdomain links in Google Webmaster Tools, but as far as I know, that does not reflect a change in how the algorithm works. Subdomains can still fragment and split link-juice. The change is more of an accounting trick, for lack of a better word.
-
Thanks, Hugh! I'm in the same boat as SEOPA with 3dcart and this seems like the best solution.
This post by Slingshot SEO seems relevant (What is a Reverse Proxy and How Can it Help My SEO?).
-
Hm. Right, I think I have another suggested solution of sorts - it's tricksy and you'd need an expert to set it up, but it'd solve your problems.
In short, if you run a reverse proxy serving your site itself on a server which ISN'T your BigCommerce server, you can tell it to fetch your main site for your www.yourdomain.com URL, and your blog (live, not cached) for www.yourdomain.com/blog. Probably your best option would be to use a reverse proxy like Varnish or Nginx, both of which are normally used for performance reasons - however, they can also be used to effectively "combine" two servers into one.
So, you'd move your DNS record to point to the reverse proxy, then set the proxy up to fetch content from your ecommerce site and your blog site.
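For what it's worth, here's a minimal sketch of what that Nginx routing could look like. It's purely illustrative - the backend hostnames are placeholders, and the Host header and WordPress site-URL handling would need to be worked out by whoever sets it up:

# Hypothetical nginx.conf fragment - backend hostnames are placeholders.
server {
    listen 80;
    server_name www.yourdomain.com;

    # Requests under /blog/ are fetched from the WordPress host.
    # The trailing slash strips the /blog prefix, so a blog installed at the
    # backend's root works; WordPress's site address would still need to be
    # set to www.yourdomain.com/blog so its generated links match.
    location /blog/ {
        proxy_pass http://blog-backend.example.com/;
        proxy_set_header Host www.yourdomain.com;
    }

    # Everything else is fetched from the BigCommerce storefront.
    location / {
        proxy_pass http://store-backend.example.com;
        proxy_set_header Host www.yourdomain.com;
    }
}

Varnish can do the same routing in its VCL; the idea is identical.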
Issues:
-
You'd need another server, and you'd need root access and an expert sysadmin to set it all up.
-
I don't know how well BigCommerce would handle a reverse proxy - but frankly, they SHOULD be able to handle it OK if you talk to their sysadmins.
Advantages:
- This would also give you massive redundancy in case of high traffic - reverse proxy setups are usually used to improve performance. You'd be Digg-proof!
It's complex, but I can see it working! Just another suggestion.
More info on reverse proxies - http://en.wikipedia.org/wiki/Reverse_proxy
-
-
Hm. It seems to me that you've just got a routing issue - there MUST be a way to fix this.
Can you run a mod_rewrite .htaccess or similar on the server?
-
It's a platform issue. BigCommerce. Everything else has been fantastic with them, but our only option for WordPress is to host on a subdomain.
The clear answer is that having it in a directory is better, but doing so means we need a very manual setup and lose the efficiencies/functionality of WordPress.
-
No opinion here.
In late 2010 we redirected two popular subdomains to folders in the root. The results have been kickass. Kickass.
-
Here is a post from earlier in the year with a similar discussion (didn't see that one before I posted this). Also looks like similar differences of opinion, though some more sources cited. http://www.seomoz.org/q/corporate-blog
Because of the lack of consensus, I'm curious to research more. Just want to make sure I/we didn't miss anything over the past few months.
-
The problem with this idea, it occurs to me on second thoughts, will be comments. Having dynamic, user-generated content will be tricky with this workaround.
Aside from that, rsync and W3TC are both enterprise-level stable solutions, so it SHOULD work - but I agree, it's doing something new, and new's always a bit risky.
Would you be able to go into any detail as to why you can't host WP? Is it a hosting company issue, a platform language issue, or something else?
-
James: do you have a source for the statement that Google now treats subs as a key site element?
-
Interesting. I need to research this more. It sounds like it's prone to errors, but maybe not.
-
If I could not have a blog that is going to receive massive work in a subfolder, I would be looking for a different platform for the site or a different method of creating the blog.
Placing that blog on a subdomain or on a satellite site is like tossing away great content, IMO.
-
Google now treats subdomains as a key element of the site
[citation needed]
Though I know what you're talking about - http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
However, as far as I'm aware, there's no information yet as to how Google are changing the weighting of these links (or even if they are), so I'd still be wary of charging ahead with a subdomain.
-
Can you rsync or otherwise automatically copy content onto your primary web server? If so, there may be a way to combine the best of all worlds.
Set up your WordPress platform somewhere else - it doesn't matter where. Make sure Google isn't crawling it, to avoid duplicate content penalties. Install a caching solution like W3 Total Cache, which writes the entire blog as static HTML to the disk.
Now, have a frequently-running automatic synchronisation tool copy those files from that location on your blog server to the local directory on your web server corresponding to yourdomain.com/blog. Set up the same rewrite rules on your main server as W3TC uses on your blog server.
You should now have an automatically-updated static copy of your blog hosted under yourdomain.com/blog. As a bonus, it'll be fast as hell and stable as a large room full of horses.
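As a rough sketch of what that sync step could look like (the hostname, paths and schedule are placeholders, and the exact directory W3TC writes its static cache to would need to be confirmed on your install), a cron entry on the main web server might be:

# Hypothetical cron entry - pull the blog's static cache every 5 minutes.
# Source and destination paths are placeholders.
*/5 * * * * rsync -az --delete user@blog-server.example.com:/var/www/blog-cache/ /var/www/site/blog/

The --delete flag keeps deleted or renamed posts from lingering as orphaned files on the main server.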
The actual setup's a bit of a faff, but my (non-pro SEO) intuition is that it'll be the best solution SEO-wise.
-
Thanks for the input, James. Agreed on the external site. I didn't know about subs being treated as a key element now. So other sites linking to posts on the blog (if the blog is on blog.site.com) will still benefit the primary domain?
Having it in a folder is doable, but more difficult to manage on an ongoing basis. I think it's a question of how much better it is to have it at site.com/blog...
-
If you cannot get it into a subfolder, i.e. site.com/blog, then the next best thing is to have it on a subdomain, blog.site.com.
Google now treats subdomains as a key element of the site, yet subfolders work better for internal linking.
I would not put it on an external site.