A Blog Structure Dilemma We're Facing...
-
We're launching a pretty large content program (in the form of a blog) and have a structure issue:
We're big fans of WordPress for efficiency reasons, but our platform doesn't allow hosting a WordPress (or other third-party) blog on the primary domain where we want it: site.com/blog.
Here are the options:
1. Sub-domain: We can easily put it there. The benefit is that we keep the efficient WordPress tools and it's very fast to set up. The downside is that the root domain won't get the benefit of any backlinks to the blog (as far as I understand), and I also don't believe the primary domain will benefit from the daily fresh/unique content the blog offers.
2. Custom Rig: We could create our own manual system of pages on the site to look just like our blog would. This would allow us to have it at site.com/blog and benefit from any backlinks and fresh content. The downside is that it won't be as efficient to manage.
3. External Site: Create a different site just for the blog. Same issue as the sub-domain I believe.
User experience is a top priority, and all of the above can pretty much accomplish the same UX goal, with #3 requiring some additional strategy on positioning.
Is #1 or #3 going to be a big regret down the road, though, and is the backlink/content benefit clearly worth doing #2?
(correct me if I'm wrong on my assumptions with #1 but at least with the backlinks I'm almost certain that's the case)
Many thanks for your inputs on this.
-
Matt Cutts
Subdomains vs. Subdirectories: "What's the difference between using subdomains and subdirectories? When it comes to Google, there aren't major differences between the two, so when you're making that decision, do what works for you and your visitors." http://googlewebmastercentral.blogspot.com/2008/01/feeling-lucky-at-pubcon.html
"Deb, it really is a pretty personal choice. For something small like a blog, it probably won't matter terribly much. I used a subdirectory because it's easier to manage everything in one file storage space for me. However, if you think that someday you might want to use a hosted blog service to power your blog, then you might want to go with blog.example.com just because you could set up a CNAME or DNS alias so that blog.example.com pointed to your hosted blog service." http://www.mattcutts.com/blog/subdomains-and-subdirectories/
-
I also noticed that the sitelinks often include links from subdomains.
And Matt Cutts has said it's a personal choice, and GWMB states it makes no difference to them.
I have had good results so far with subdomains. I remember asking you for advice about a year or two ago, and you recommended good linking between sub and root domains to show the connection.
I have followed that advice, and the sitelinks for my sites in Google reflect the subdomains as subcategories of the root.
So I am convinced subdomains act like subfolders; at least they have so far for me.
-
Great idea -- and the link Scot posted is perfect. However, our platform doesn't give us access to mod_proxy or .htaccess, so we are unable to set up the reverse proxy, unfortunately. Sigh.
-
Agreed - Google is consolidating subdomain links in Google Webmaster Tools, but as far as I know, that does not reflect a change in how the algorithm works. Subdomains can still fragment and split link-juice. The change is more of an accounting trick, for lack of a better word.
-
Thanks, Hugh! I'm in the same boat as SEOPA with 3dcart and this seems like the best solution.
This post by Slingshot SEO seems relevant (What is a Reverse Proxy and How Can it Help My SEO?).
-
Hm. Right, I think I have another suggested solution of sorts - it's tricksy and you'd need an expert to set it up, but it'd solve your problems.
In short, if you run a reverse proxy serving your site itself on a server which ISN'T your BigCommerce server, you can tell it to fetch your main site for your www.yourdomain.com URL, and your blog (live, not cached) for www.yourdomain.com/blog. Probably your best option would be to use a reverse proxy like Varnish or Nginx, both of which are normally used for performance reasons - however, they can also be used to effectively "combine" two servers into one.
So, you'd move your DNS record to point to the reverse proxy, then set the proxy up to fetch content from your ecommerce site and your blog site.
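To make the routing idea concrete, below is a very rough Python sketch of that logic. The hostnames and port are made-up placeholders, and in practice you'd use Varnish or Nginx as mentioned above rather than anything hand-rolled.

```python
# Conceptual sketch only, to show the routing logic -- in practice you'd use
# Varnish or Nginx as described above. Hostnames and the port are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

STORE_BACKEND = "https://store.example-backend.com"  # the BigCommerce-hosted site (assumed)
BLOG_BACKEND = "https://blog.example-backend.com"    # the WordPress blog (assumed)

class BlogAwareProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Anything under /blog is fetched from the blog backend; everything else
        # comes from the store, so visitors only ever see www.yourdomain.com.
        backend = BLOG_BACKEND if self.path.startswith("/blog") else STORE_BACKEND
        upstream = urlopen(Request(backend + self.path))
        body = upstream.read()
        self.send_response(upstream.status)
        self.send_header("Content-Type", upstream.headers.get("Content-Type", "text/html"))
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Point www.yourdomain.com's DNS at the machine running this proxy.
    HTTPServer(("0.0.0.0", 8080), BlogAwareProxy).serve_forever()
```

Varnish or Nginx would do the same /blog-versus-everything-else split, just far faster and with proper caching, header handling and failover.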
Issues:
-
You'd need another server, and you'd need root access and an expert sysadmin to set it all up.
-
I don't know how well BigCommerce would handle a reverse proxy - but frankly, they SHOULD be able to handle it OK if you talk to their sysadmins.
Advantages:
- This would also give you massive redundancy in case of high traffic - reverse proxy setups are usually used to improve performance. You'd be Digg-proof!
It's complex, but I can see it working! Just another suggestion.
More info on reverse proxies - http://en.wikipedia.org/wiki/Reverse_proxy
-
Hm. It seems to me that you've just got a routing issue - there MUST be a way to fix this.
Can you run a mod_rewrite .htaccess or similar on the server?
-
It's a platform issue. BigCommerce. Everything else has been fantastic with them, but our only option for WordPress is to host on a subdomain.
The clear answer is that having it in a directory is better, but doing so means we need a very manual setup and lose the efficiencies/functionality of WordPress.
-
No opinion here.
In late 2010 we redirected popular subdomains to folders in the root. The results have been kickass. Kickass.
-
Here is a post from earlier in the year with a similar discussion (I didn't see that one before I posted this). It also looks like there are similar differences of opinion, though with some more sources cited. http://www.seomoz.org/q/corporate-blog
Because of the lack of consensus, I'm curious to research more. Just want to make sure I/we didn't miss anything over the past few months.
-
The problem with this idea, it occurs to me on second thoughts, will be comments. Handling dynamic, user-generated content will be tricky with this workaround.
Aside from that, rsync and W3TC are both enterprise-level stable solutions, so it SHOULD work - but I agree, it's doing something new, and new's always a bit risky.
Would you be able to go into any detail as to why you can't host WP? Is it a hosting company issue, a platform language issue, or something else?
-
James: do you have a source for the statement that Google now treats subs as a key site element?
-
Interesting. I need to research this more. It sounds like it's prone to errors, but maybe not.
-
If I could not have the blog that is going to receive massive work in a subfolder, I would be looking for a different platform for the site, or a different method of creating the blog.
Placing that blog on a subdomain or on a satellite site is like tossing away great content, imo.
-
"Google now treats sub domains as a key element of the site"
[citation needed]
Though I know what you're talking about - http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
However, as far as I'm aware, there's no information yet as to how Google are changing the weighting of these links (or even if they are), so I'd still be wary of charging ahead with a subdomain.
-
Can you rsync or otherwise automatically copy content onto your primary web server? If so, there may be a way to combine the best of all worlds.
Set up your WordPress platform somewhere else - it doesn't matter where. Make sure Google isn't crawling it, to avoid duplicate content penalties. Install a caching solution like W3 Total Cache, which writes the entire blog to disk as static HTML.
Now, have a frequently updating automatic synchronisation tool copy those files from that location on your blog server to the local directory on your web server corresponding to yourdomain.com/blog. Set up the same rewrite rules on your main server as W3TC uses on your blog server.
You should now have an automatically updated static copy of your blog hosted under yourdomain.com/blog. As a bonus, it'll be fast as hell and stable as a large room full of horses.
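For what it's worth, here's a rough sketch of what that sync step could look like. The paths, hostname and five-minute interval are all hypothetical, and a one-line cron entry running rsync would do exactly the same job.

```python
# Hypothetical sketch of the sync step -- paths, hostname and interval are
# placeholders. A cron entry running rsync would work just as well.
import subprocess
import time

BLOG_SOURCE = "deploy@blog-host.example.com:/var/www/blog-cache/"  # W3TC's static output (assumed path)
WEB_TARGET = "/var/www/yourdomain.com/blog/"                       # served as yourdomain.com/blog (assumed path)

def sync_blog() -> None:
    # --archive preserves timestamps/permissions, --delete removes posts that
    # disappeared from the source, --compress saves bandwidth over SSH.
    subprocess.run(
        ["rsync", "--archive", "--delete", "--compress", BLOG_SOURCE, WEB_TARGET],
        check=True,
    )

if __name__ == "__main__":
    while True:
        sync_blog()
        time.sleep(300)  # pull a fresh copy every five minutes
```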
The actual setup's a bit of a faff, but my (non-pro SEO) intuition is that it'll be the best solution SEO-wise.
-
Thanks for the input, James. Agreed on the external site. I didn't know about subs being treated as a key element now. So other sites linking to posts on the blog (if the blog is on blog.site.com) will still benefit the primary domain?
Having it in a folder is doable, but more difficult to manage on an ongoing basis. I think it's a question of how much better it is to have it at site.com/blog...
-
If you cannot get it onto a sub folder, i.e. site.com/blog, then the next best option is to have it on a sub domain, blog.site.com.
Google now treats sub domains as a key element of the site, yet sub folders work better for internal linking.
I would not put it on an external site.