Using a third party server to host site elements
-
Hi guys -
I have a client who has recently been experiencing a great deal more traffic to their site. As a result, their web development agency has given them a server upgrade to cope with the new demand.
One thing they have also done is put all website scripts, CSS files, images and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server now just handles the page requests, and all other elements are grabbed from the Amazon S3 server. So basically, this means the HTML content and web pages are still hosted on my client's domain, but all other content is served from an Amazon S3 URL.
I'm wondering what SEO implications this will have for my client's domain. While all pages and HTML content are still accessible through their domain name, each page is of course now making many server calls to the Amazon S3 server through external URLs (s3.amazonaws.com).
I imagine this means any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile, because that content is no longer physically part of their domain. What I'm more concerned about, though, is whether all of these external server calls will have a negative effect on the pages' overall value. Should I be advising my client to ensure all site elements are hosted on their own server, so that everything is accessible through their domain?
Hope this makes sense (I'm not the best at explaining things!)
-
Hello Zeal Digital,
I use a CDN (Content Delivery Network) for images, CSS and JavaScript.
Doing that adds only about $10 per month in cost for a site with around 800,000 pageviews per month.
You have complete control over the images. If there is a problem, you can force the CDN to flush a file and reload it from the source. You add code to your .htaccess file that tells the CDN how long to store images before flushing them and fetching a new copy. It is all automated, so there is generally no work for you to do. I host with softlayer.com and this is part of their service.
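For illustration, that cache lifetime can be set with a few lines like these in .htaccess (a minimal sketch, assuming Apache with mod_expires and mod_headers enabled; the 30-day lifetime is just an example):

    <IfModule mod_expires.c>
      # Tell the CDN (and browsers) to keep static files for 30 days
      ExpiresActive On
      ExpiresByType image/png  "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType text/css   "access plus 1 month"
      ExpiresByType application/javascript "access plus 1 month"
    </IfModule>
    <IfModule mod_headers.c>
      <FilesMatch "\.(png|jpe?g|gif|css|js)$">
        Header set Cache-Control "public, max-age=2592000"
      </FilesMatch>
    </IfModule>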
The change from self-hosted images, CSS and scripts made a massive improvement on the server - a 16-processor Linux box with twin 15,000rpm SCSI drives and 12GB of RAM, so it is quite fast!
Page delivery times improved by 1-2 seconds.
The server now is so lightly loaded that it could be downgraded to save more money.
It has zero effect on SEO. The CDN is accessed using a CNAME - static.domain.com - so don't worry about it looking like components come from other places.
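In DNS terms that is just a record along these lines (a hypothetical zone-file entry - the target hostname is whatever your CDN provider assigns you):

    ; static.domain.com is an alias for the CDN's edge hostname (hypothetical)
    static.domain.com.   IN   CNAME   cdn1234.edge-provider.example.net.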
The CDN has servers all over the world, so no matter where the visitors are, it is only a few hops for them to get most of the content. That makes it much faster for someone in Australia who would normally be pulling images from a server in the USA.
The only problem with Amazon S3 is that it has gone down a few times, but other than that, it is a good thing to do.
I wouldn't advise them to self-host, unless you want to increase their costs, server loading and page delivery times.
-
Great advice, cheers Jeffery!
-
I work with a number of high-traffic sites (terabytes of data each day, tens of millions of page views per month). With many of these sites, we have offloaded static content either to dedicated static content servers (typically cloud-based so we can scale up and down) or to content delivery networks. I've not had anyone report any SEO impact.
In contrast, they often see user engagement (page views per user), repeat visitors and other traffic metrics improve. Users like fast sites. Google apparently likes fast sites too, so while I've not seen it myself, you could actually get a boost in your SERPs from faster-loading pages.
If you break down a modern web page, you will find it requires numerous elements: dozens of images, CSS and JavaScript files, as well as the page itself. Every one of these items requires a request to the web server.
With some graphics-intensive sites, I've seen as much as 95% of all web server (HTTP) requests be attributable to static content. By moving those requests to other systems, you free your primary server to handle the application itself. This provides a better user experience and improves scalability.
Content Delivery Networks
I do not use Amazon Web Services, so I do not know specifically what they offer. But here are two CDNs I have used with good success:
Internap:
http://www.internap.com/cdn-services-content-delivery-network/
Edgecast:
One feature I look for is called "origin pull." With this method, you do not have to upload files to the CDN; the CDN fetches them automatically from your site as needed. I have found this much easier to manage on sites with frequent content updates.
-
Hosting images externally has never had any impact in the cases I've had a chance to observe. The only problems I can think of are that you lose some control over loading times, or that somebody takes an image and credits (links to) the image hosting domain instead of your domain.
-
A couple of notes for you:
- There isn't any SEO impact from WHERE the data is loaded from. Look at any major website (especially one that ranks well) and you'll see it openly using a content delivery network (Akamai, Amazon S3/CloudFront, etc.) for static content. This is good business practice because it takes that load off your web server and often places the content closer to the client. Faster content delivery can also help SEO if you have a slow server.
- If they're using raw S3 buckets, I would HIGHLY suggest signing up for CloudFront. There are two benefits to doing this. First, you put the content onto Amazon's edge network, where it is more readily available. Second, you can use domain aliasing to help obscure the source. For instance, let's say you have an images bucket. You could add a CNAME DNS record for images.yourdomain.com and then put that hostname into your source code. Anyone can still see where the DNS takes them, but it's not obvious to the general public. The cost difference between raw S3 delivery and CloudFront is negligible.
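To illustrate, the switch in your page source would look something like this (hypothetical bucket and domain names - the actual CloudFront hostname and alias depend on how the distribution is set up):

    <!-- Before: raw S3 bucket URL exposed in the page source (hypothetical bucket) -->
    <img src="http://my-images-bucket.s3.amazonaws.com/logo.png" alt="Logo">

    <!-- After: CNAME alias pointing at the CloudFront distribution -->
    <img src="http://images.yourdomain.com/logo.png" alt="Logo">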
Oh, and I use Amazon CloudFront for my own delivery. I've never had any SEO issues with doing so.
-
I don't recommend keeping the resources and database on a different server from the site files; it creates extra traffic between the servers, the resources are slower to load, and the site's optimum speed is reduced. You also can't combine and compress that content together, so each item is downloaded independently.