A Blog Structure Dilemma We're Facing...
-
We're launching a pretty large content program (in the form of a blog) and have a structure issue:
Big fans of WordPress for efficiency reasons, but our platform doesn't allow hosting of a WordPress (or other 3rd-party) blog on the primary domain where we want it: site.com/blog.
Here are the options:
1. Sub-domain: We can easily put it there. The benefit is we get the efficient WordPress tools and it's very fast to set up. The downside is that the root domain won't get the benefit of any backlinks to the blog (as far as I understand). I also don't believe the primary domain will benefit from the daily fresh/unique content the blog offers.
2. Custom Rig: We could create our own manual system of pages on the site to look just like our blog would. This would allow us to have it at site.com/blog and benefit from any backlinks and fresh content. The downside is that it won't be as efficient to manage.
3. External Site: Create a different site just for the blog. Same issue as the sub-domain I believe.
User experience is a top priority, and all of the above can pretty much accomplish the same UX goal, with #3 requiring some additional strategy on positioning.
Is #1 or #3 going to be a big regret down the road, though, and is the backlink/content benefit clearly worth doing #2?
(Correct me if I'm wrong on my assumptions with #1, but at least with the backlinks I'm almost certain that's the case.)
Many thanks for your inputs on this.
-
Matt Cutts
Subdomains vs. Subdirectories: What's the difference between using subdomains and subdirectories? When it comes to Google, there aren't major differences between the two, so when you're making that decision, do what works for you and your visitors. http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
Deb, it really is a pretty personal choice. For something small like a blog, it probably won’t matter terribly much. I used a subdirectory because it’s easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service. http://www.mattcutts.com/blog/subdomains-and-subdirectories/
-
I also noticed that the sitelinks often include links from subdomains.
And Matt Cutts has said it's a personal choice, and GWMB states it makes no difference to them.
I have had good results so far with subdomains. I remember asking you for advice about a year or two ago; you recommended good linking between sub and root domains to show the connection.
I have followed that advice, and the sitelinks for my sites in Google reflect the subdomains as subcategories of the root.
So I am convinced subdomains act like subfolders; at least they have so far for me.
-
Great idea -- and the link Scot posted is perfect. However, our platform doesn't give us access to mod_proxy or .htaccess, so we are unable to set up the reverse proxy, unfortunately. Sigh.
-
Agreed - Google is consolidating subdomain links in Google Webmaster Tools, but as far as I know, that does not reflect a change in how the algorithm works. Subdomains can still fragment and split link-juice. The change is more of an accounting trick, for lack of a better word.
-
Thanks, Hugh! I'm in the same boat as SEOPA with 3dcart and this seems like the best solution.
This post by Slingshot SEO seems relevant (What is a Reverse Proxy and How Can it Help My SEO?).
-
Hm. Right, I think I have another suggested solution of sorts - it's tricksy and you'd need an expert to set it up, but it'd solve your problems.
In short, if you run a reverse proxy serving your site itself on a server which ISN'T your BigCommerce server, you can tell it to fetch your main site for your www.yourdomain.com URL, and your blog (live, not cached) for www.yourdomain.com/blog. Probably your best option would be to use a reverse proxy like Varnish or Nginx, both of which are normally used for performance reasons - however, they can also be used to effectively "combine" two servers into one.
So, you'd move your DNS record to point to the reverse proxy, then set the proxy up to fetch content from your ecommerce site and your blog site.
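To make it a bit more concrete, a minimal Nginx sketch might look something like the below - the upstream hostnames are pure placeholders (I don't know your real BigCommerce or blog hosts), so treat it as an illustration rather than a drop-in config:

    # Sketch only: store-backend.example and blog-backend.example are placeholders
    # for the real ecommerce and blog hostnames.
    server {
        listen 80;
        server_name www.yourdomain.com;

        # Anything under /blog/ is fetched live from the blog server
        location /blog/ {
            proxy_pass http://blog-backend.example/;   # trailing slash strips /blog/ before passing upstream
            proxy_set_header Host www.yourdomain.com;  # or the blog host's own vhost name, depending on its setup
            proxy_set_header X-Forwarded-For $remote_addr;
        }

        # Everything else is fetched from the ecommerce platform
        location / {
            proxy_pass http://store-backend.example;
            proxy_set_header Host www.yourdomain.com;
            proxy_set_header X-Forwarded-For $remote_addr;
        }
    }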
Issues:
- You'd need another server, and you'd need root access and an expert sysadmin to set it all up.
- I don't know how well BigCommerce would handle a reverse proxy - but frankly, they SHOULD be able to handle it OK if you talk to their sysadmins.
Advantages:
- This would also give you massive redundancy in case of high traffic - reverse proxy setups are usually used to improve performance. You'd be Digg-proof!
It's complex, but I can see it working! Just another suggestion.
More info on reverse proxies - http://en.wikipedia.org/wiki/Reverse_proxy
-
Hm. It seems to me that you've just got a routing issue - there MUST be a way to fix this.
Can you run mod_rewrite via .htaccess or similar on the server?
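To illustrate what I mean - and this is only a sketch, since the [P] flag also needs mod_proxy enabled on the host, and blog-backend.example is a placeholder for wherever the blog actually lives:

    # Sketch only - needs both mod_rewrite AND mod_proxy enabled by the host.
    RewriteEngine On
    # Hand anything under /blog/ to the blog server via mod_proxy
    RewriteRule ^blog/(.*)$ http://blog-backend.example/$1 [P,L]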
-
It's a platform issue. BigCommerce. Everything else has been fantastic with them, but our only option for WordPress is to host on a subdomain.
The clear answer is that having it in a directory is better, but doing so means we need a very manual setup and lose the efficiencies/functionality of WordPress.
-
No opinion here.
In late 2010 we redirected popular subdomains to folders in the root. The results have been kickass. Kickass.
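For anyone wondering what that involves, a typical rule on the subdomain looks something like this (example.com is a placeholder, and this is an illustration rather than our exact setup):

    # Illustrative only - sends every URL on blog.example.com to the matching
    # path under /blog/ on the root domain with a 301.
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^blog\.example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/blog/$1 [R=301,L]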
-
Here is a post from earlier in the year with a similar discussion (I didn't see that one before I posted this). Also looks like similar differences of opinion, though with some more sources cited. http://www.seomoz.org/q/corporate-blog
Because of the lack of consensus, I'm curious to research more. Just want to make sure I/we didn't miss anything over the past few months.
-
The problem with this idea, it occurs to me on second thoughts, will be comments. Having dynamic, user-generated content will be tricky with this workaround.
Aside from that, rsync and W3TC are both enterprise-level stable solutions, so it SHOULD work - but I agree, it's doing something new, and new's always a bit risky.
Would you be able to go into any detail as to why you can't host WP? Is it a hosting company issue, a platform language issue, or something else?
-
James: do you have a source for the statement that Google now treats subs as a key site element?
-
Interesting. I need to research this more. It sounds like it's prone to errors, but maybe not.
-
If I could not have the blog that is going to receive massive work in a subfolder, I would be looking for a different platform for the site or a different method of creating the blog.
Placing that blog on a subdomain or on a satellite site is like tossing away great content, IMO.
-
"Google now treats subdomains as a key element of the site"
[citation needed]
Though I know what you're talking about - http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
However, as far as I'm aware, there's no information yet as to how Google are changing the weighting of these links (or even if they are), so I'd still be wary of charging ahead with a subdomain.
-
Can you rsync or otherwise automatically copy content onto your primary web server? If so, there may be a way to combine the best of all worlds.
Set up your WordPress platform somewhere else - doesn't matter where. Make sure Google isn't crawling it, to avoid duplicate content penalties. Install a caching solution like W3 Total Cache, which writes the entire blog as static HTML to the disk.
Now, have a frequently-updating automatic synchronisation tool copy those files from the location on your blog server to the local directory on your web server corresponding to yourdomain.com/blog. Set up the same rewrite rules on your main server as W3TC uses on your blog server.
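As a rough sketch, the sync step could be as simple as a cron job on the blog server - the paths, deploy user, and cache location below are all made up, so adjust to wherever W3TC actually writes its static files:

    # Hypothetical crontab entry on the blog server: every 5 minutes, push the
    # static W3TC output to the /blog directory on the main web server.
    */5 * * * * rsync -az --delete /var/www/blog-cache/ deploy@www.example.com:/var/www/site/blog/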
You should now have an automatically-updated static copy of your blog hosted under yourdomain.com/blog. As a bonus, it'll be fast as hell and stable as a large room full of horses.
The actual setup's a bit of a faff, but my (non-pro SEO) intuition is that it'll be the best solution SEO-wise.
-
Thanks for the input, James. Agreed on the external site. I didn't know about subs being treated as a key element now. So other sites linking to posts on the blog (if the blog is on blog.site.com) will still benefit the primary domain?
Having it in a folder is doable, but more difficult to manage ongoing. I think it's a question of 'how much better' it is to have it at site.com/blog...
-
If you cannot get it onto a subfolder, i.e. site.com/blog, then the next best is to have it on a subdomain, blog.site.com.
Google now treats subdomains as a key element of the site, yet subfolders work better for internal linking.
I would not put it on an external site.