International SEO and server hosting
-
I'd appreciate feedback on a situation. We're going through a major overhaul in how we globally manage our websites.
Regional servers were part of our original plan (one in Chicago, one in the UK, and one in APAC), but we've identified a number of issues with this approach. Although it's considered a best practice by many, the challenges we'd face are considerable (added complexity, plus extra steps and delays when updating sites, among others).
So, we shifted our plan and are now looking at hosting here in the US, but using Akamai to deliver images and other heavier assets from their local servers (in the UK, etc.). This is how many of the larger companies, like Amazon, deliver their global websites.
We hope that using Akamai will give us good performance while simplifying our process. Any warning signs we should be aware of? Is anyone doing it this way, and has the experience been good or bad?
-
Gerd knows a lot more about CDNs than I do
Yes, you absolutely need to have the CDN content appear under your own subdomain. Standard SEO applies for your image and video content optimization, to make sure the content which now sits on the subdomain (rather than your main domain) gets indexed properly.
-
Make sure that your CDN service provides domain aliasing - for example, if your domain is www.example.com, you want the CDN hostname to be part of your domain, i.e. cdnuk.example.com for the UK region.
You will then at least get some value from image crawlers, etc. Don't go with any CDN service that does not allow your content to resolve to a subdomain of your primary domain.
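To make the idea concrete, here is a minimal sketch of serving assets from a regional CDN alias while keeping everything under the primary domain. The hostnames (cdnuk.example.com and so on) and the region map are hypothetical, not real Akamai endpoints:

```python
from urllib.parse import urlparse

# Hypothetical region -> CDN alias map; these are illustrative
# subdomains you would CNAME to your CDN provider, not real hosts.
CDN_HOSTS = {
    "uk": "cdnuk.example.com",
    "us": "cdnus.example.com",
    "apac": "cdnapac.example.com",
}

PRIMARY_DOMAIN = "example.com"

def cdn_url(asset_path, region):
    """Rewrite a site-relative asset path to the regional CDN alias."""
    return f"https://{CDN_HOSTS[region]}{asset_path}"

def is_own_subdomain(url):
    """Check that a CDN URL resolves under the primary domain,
    so crawlers credit the image/video content to your site."""
    host = urlparse(url).hostname or ""
    return host == PRIMARY_DOMAIN or host.endswith("." + PRIMARY_DOMAIN)
```

The second check is the one the advice above hinges on: a bare provider hostname (e.g. something.akamaihd.net) would fail it, while an aliased subdomain passes.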
SEO does play a role, though, as the speed of the CDN will affect your overall page speed, and will also affect how much content a bot can crawl within your allocated crawl quota. The faster your load time/CDN, the more content will be crawled.
I would not bother with localisation tags if your main objective is to optimise performance/page-load time based on your users' geo-location.
It looks like you have set your mind on Akamai, but I would perhaps also evaluate Amazon S3/CloudFront or Rackspace, as those services deliver the same level of SLA but might be more cost-effective for your purposes.
Get your CDN providers to give you a one-to-two-month free proof of concept (they will only offer this if your traffic is substantial) so that you can try out the service. Never sign contracts longer than 12 months, and only sign an annual contract if you receive a large discount. Most CDN companies will charge you for 10 months' worth when you sign up for an annual contract.
Also ensure that your CDN provider gives you near-real-time, or preferably real-time, access to statistics and performance reports (you want to see how many requests per second they have served and what the speed was).
Test your site / CDN via tools such as webpagetest.org or pingdom.com - they have POPs across the globe to simulate remote tests.
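Alongside those tools, a quick single-location sanity check can be scripted. This is a minimal sketch only: unlike webpagetest.org or pingdom.com it cannot test from POPs around the globe, and the URL you pass is whatever you choose to measure:

```python
import time
import urllib.request

def time_fetch(url):
    """Fetch a URL and return (elapsed seconds, response body).
    A rough check from your own location only - it says nothing
    about what a user in another region will experience."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return time.perf_counter() - start, body
```

Running it against the same asset on your origin and on the CDN alias gives a first hint of whether the CDN is actually faster for you; the global picture still needs the remote-POP tools.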
-
Thanks for confirming!
-
You don't need to do this anymore. Google uses other signals now to determine what region you should appear in. They understand that someone may choose to host a site in the US rather than some small country for reliability reasons. Just geo-target your sites and you will be fine:
a) add language tags
b) proper language for that region
c) add your local address and contact information to your footer globally if possible
d) geo-target in WMT
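For item a), language/region annotations can be generated mechanically once you know your regional URLs. A minimal sketch, assuming hypothetical regional subdomains (us.example.com, uk.example.com) that stand in for whatever structure you actually use:

```python
# Hypothetical map of hreflang codes to regional site roots.
REGIONAL_SITES = {
    "en-us": "https://us.example.com/",
    "en-gb": "https://uk.example.com/",
    "x-default": "https://www.example.com/",
}

def hreflang_tags(path=""):
    """Emit the <link rel="alternate" hreflang=...> tags that each
    regional version of the page should carry in its <head>."""
    lines = [
        f'<link rel="alternate" hreflang="{code}" href="{base}{path}" />'
        for code, base in sorted(REGIONAL_SITES.items())
    ]
    return "\n".join(lines)
```

Every regional version of a page should list the full set of alternates, including itself, or the annotations are ignored.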
Sites like Amazon serve their heavier assets locally for performance reasons, not for SEO.
The same rules still apply, though, for interlinking commonly owned sites sitting on the same server.