Server vs. Authority
-
Deciding whether to go for a subdirectory or ccTLD structure.
The tradeoff would be a single server location (which can affect local rankings if the server is outside the target country) versus better passing of link authority.
Which factor is more important?
-
ccTLDs and subdirectories are both valid, but I'd recommend a subdirectory over a ccTLD.
Be aware that duplicate content can be a huge issue either way, whether you go with a ccTLD, a subdirectory, or even a separate domain.
I would highly recommend using a subdirectory with unique text in it, or using a different language (the same content in a different language won't be a problem!)
-
Thanks Rebekah,
All in the same language, with very similar content. Wouldn't duplicate content be an issue for both strategies?
-
Will the content be in a different language? If so, I would recommend the subdirectory; if not, the ccTLD. It's personal preference, and Google has said you could use either, but I can see a lot of duplicate content issues if the subdirectory is in the same language.
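If the same-language versions target different countries, hreflang annotations are Google's documented way to mark them as regional alternates rather than duplicates. A minimal sketch, assuming an Apache server with mod_headers and hypothetical /uk/ and /us/ subdirectories on example.com; each variant must list every version, including itself, and the same annotations can go in the HTML head instead:

```apache
# Hypothetical sketch (Apache, mod_headers, main server/vhost config):
# announce the UK and US English variants of one page via HTTP Link headers,
# the header equivalent of <link rel="alternate" hreflang="...">.
<Location "/uk/pricing/">
    Header add Link "<http://example.com/uk/pricing/>; rel=\"alternate\"; hreflang=\"en-gb\""
    Header add Link "<http://example.com/us/pricing/>; rel=\"alternate\"; hreflang=\"en-us\""
</Location>
```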
Related Questions
-
301 redirect: keep HTML files on the server?
Hello, just one quick question which came up in the discussion here: http://moz.com/community/q/take-a-good-amount-of-existing-landing-pages-offline-because-of-low-traffic-cannibalism-and-thin-content When I do a 301 redirect that merges the content of two pages, should I keep the redirecting page/HTML file on the server, or should I delete it? Or does it make no difference at all?
Technical SEO | _Heiko_
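For context, a 301 set up in Apache's .htaccess maps the old URL before the filesystem is consulted, so the redirecting HTML file is never served again and can be deleted once the rule is live. A minimal sketch, assuming mod_alias and hypothetical paths:

```apache
# Hypothetical .htaccess sketch (Apache, mod_alias): the server answers the
# old URLs with a 301 before looking for a file on disk, so the stale HTML
# files behind them can safely be removed.
Redirect 301 /old-page-a.html /merged-page.html
Redirect 301 /old-page-b.html /merged-page.html
```
-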
"non-WWW" vs "WWW" in Google SERPS and Lost Back Link Connection
A Screaming Frog report indicates that Google is indexing a client's site for both: www and non-www URLs. To me this means that Google is seeing both URLs as different even though the page content is identical. The client has not set up a preferred URL in GWMTs. Google says to do a 301 redirect from the non-preferred domain to the preferred version but I believe there is a way to do this in HTTP Access and an easier solution than canonical.
Technical SEO | | RosemaryB
https://support.google.com/webmasters/answer/44231?hl=en GWMTs also shows that over the past few months this client has lost more than half of their backlinks. (But there are no penalties and the client swears they haven't done anything to be blacklisted in this regard. I'm curious as to whether Google figured out that the entire site was in their index under both "www" and "non-www" and therefore discounted half of the links. Has anyone seen evidence of Google discounting links (both external and internal) due to duplicate content? Thanks for your feedback. Rosemary0 -
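The .htaccess route the question alludes to is a mod_rewrite rule that 301s every request on one hostname to the other. A minimal sketch, assuming Apache and the hypothetical example.com:

```apache
# Hypothetical .htaccess sketch (Apache, mod_rewrite): 301 every non-www
# request to the www host so only one version remains in Google's index.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```
-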
Can anyone speak to the pros and cons of installing mod_expires on an Apache server?
We recently had mod_deflate and mod_expires installed on our server in an attempt to improve page speed. They worked beautifully, or at least we thought they did: Google's PageSpeed Insights tool scored our homepage at 65 before the install and 90 after, a major improvement. However, we seem to be experiencing very slow loads on our product pages. There is a feeling (not based on any quantifiable data) that mod_expires is actually slowing down our page load, particularly for visitors who do not have the page cached (which would probably be most visitors). Here are some pages to look at with their corresponding PageSpeed Insights scores:
Live Sound - 91: http://www.ccisolutions.com/StoreFront/category/live-sound-live-audio
Wireless Microphones - 90: http://www.ccisolutions.com/StoreFront/category/microphones
Truss and Rigging - 79: http://www.ccisolutions.com/StoreFront/category/lighting-truss
Lightweight product detail page - 83: http://www.ccisolutions.com/StoreFront/product/global-truss-sq-4109-12-truss-segment
Heavyweight product detail page - 77: http://www.ccisolutions.com/StoreFront/product/presonus-studiolive-16-4-2
Any thoughts from my fellow Mozzers would be greatly appreciated!
Technical SEO | danatanseo
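For reference, mod_expires does nothing per request beyond appending Expires/Cache-Control headers by content type, so it is an unlikely cause of slow uncached loads. A typical setup looks something like this sketch (the lifetimes are hypothetical; tune them per asset type):

```apache
# Hypothetical sketch (Apache, mod_expires): only response headers are added;
# the lifetimes below are illustrative examples.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg "access plus 1 month"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresDefault "access plus 1 day"
</IfModule>
```
-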
If a permanent redirect is supposed to transfer SEO from the old page to the new page, why has my domain authority been impacted?
For example, we redirected our old domain to a new one (leaving no duplicate content on the old domain) and saw a 40% decrease in Domain Authority. Isn't a permanent redirect supposed to transfer link authority to the destination it points to? Did I do something wrong?
Technical SEO | BlueLinkERP
-
Does the server affect indexing speed?
A bit of a strange question, this one: I have a domain which, when hosted on my Dutch server, can get new blog posts indexed and ranking in less than 10 minutes using the PubSubHubbub plugin. However, I moved the blog and domain to a UK dedicated server and continued to post. Days later, none of those posts were indexed. I then moved the domain back to the Dutch server to test this, posted on the blog, and once again it was indexed and ranking in 20 minutes or so. To cut a long and tedious story short: in a bid to be closer to my customers, I moved the domain to a UK VPS three days ago. I have posted, but no posts are indexed. Has anyone else experienced anything like this? Generally I don't move domains back and forth so much, but I wanted to test this out. The Dutch server is a 16-core, 24 GB DirectAdmin dedicated machine; the two UK servers were both running cPanel. I understand that it would be best to host as close as possible to my customers, but the difficulty of getting posts indexed in the UK is becoming a problem. Thanks, Carl
Technical SEO | Grumpy_Carl
-
Should I be using rel=author in this case?
We have a large blog, and it appears one of our regional blogs (managed separately) is simply scraping content from our blog and adding it to theirs. Would adding rel=author (for all of our guest bloggers) help keep Google from seeing the regional blog's content as scraped or duplicate? Is rel=author the best solution here?
Technical SEO | VistageSEO
-
Is there an easier server-side way to prevent duplicate page content?
I know that using a 301 or 302 will fix the problem of duplicate page content. My question is whether there is an easier way of preventing duplicate page content when the issue lies in the URL itself. For example:
URL: http://example.com
URL: http://www.example.com
My guess would be, like it says here, that it's a server configuration issue. If anyone has pointers on how to prevent this from occurring, it would be greatly appreciated.
Technical SEO | brianhughes
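Besides per-request rewrite rules, the consolidation can also live at the virtual-host level, which may be the "easier way from the server" being asked about. A minimal sketch, assuming Apache and the hypothetical example.com:

```apache
# Hypothetical Apache vhost sketch: the bare domain answers every request
# with a 301 to the www host, so a single hostname serves all content.
<VirtualHost *:80>
    ServerName example.com
    Redirect permanent / http://www.example.com/
</VirtualHost>

<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example
</VirtualHost>
```
-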
For Google+ purposes, should the author's name appear in the meta description or title tag of my website, just as a key search phrase would?
Relative to Cyrus Shepard's January 4th article regarding Google's superior SEO strategy: if I'm the primary author of all blog articles and website content, and I have a link showing authorship going back to Google+, is a site-wide link from the home page enough, or should it appear on all blog posts, editorial pages, and comment pages? Conversely, should the author's name appear in the meta description or title tag of my website, just as a key search phrase would, since Google appears to be trying to make a solid connection between my name and all my content?
Technical SEO | lwnickens