Using a third party server to host site elements
-
Hi guys -
I have a client whose site has recently been experiencing a great deal more traffic. As a result, their web development agency has given them a server upgrade to cope with the new demand.
One thing the agency has also done is move all website scripts, CSS files, images and downloadable content (such as PDFs) onto a third-party server (Amazon S3). Apparently this was done so that my client's server now only handles the page requests, and all other elements are grabbed from the Amazon S3 server. So basically, this means any HTML content and web pages are still hosted on my client's domain, but all other content is accessible through an Amazon S3 URL.
I'm wondering what SEO implications this will have for my client's domain. While all pages and HTML content are still accessible through their domain name, each page is of course now making many server calls to the Amazon S3 server through external URLs (s3.amazonaws.com).
I imagine this means any elements sitting on the Amazon S3 server can no longer contribute value to the client's SEO profile, because that content is no longer physically part of their domain. What I am more concerned about, however, is whether all of these external server calls are going to have a negative effect on the pages' value overall. Should I be advising my client to ensure all site elements are hosted on their own server, so that everything is accessible through their domain?
Hope this makes sense (I'm not the best at explaining things!)
-
Hello Zeal Digital,
I use a CDN (Content Delivery Network) for images, CSS and JavaScript.
Doing that adds only about $10 per month to the cost of a site that gets around 800,000 pageviews per month.
You have complete control over the images. If there is a problem, you can force the CDN to flush a file and reload it from the source. You add code to your .htaccess file that tells the CDN how long to store files before flushing them and fetching a new copy. It is all automated; there is generally no work for you to do. I host with softlayer.com and this is part of their service.
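For reference, here is a minimal sketch of the kind of .htaccess rules meant above. The file types and lifetimes are made-up examples, and the exact directives your host honours may differ; most CDNs key their edge caching off the Cache-Control header the origin sends.

```apache
# Illustrative .htaccess snippet (example values): tell browsers and the
# CDN how long to cache static assets before pulling a fresh copy from
# the origin. Requires mod_expires and mod_headers to be enabled.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg  "access plus 30 days"
  ExpiresByType image/png   "access plus 30 days"
  ExpiresByType text/css    "access plus 7 days"
  ExpiresByType application/javascript "access plus 7 days"
</IfModule>
<IfModule mod_headers.c>
  # Cache-Control is what most CDNs actually honour for edge caching
  <FilesMatch "\.(jpe?g|png|gif|css|js|pdf)$">
    Header set Cache-Control "public, max-age=604800"
  </FilesMatch>
</IfModule>
```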
The change from self-sourced images, CSS and scripts made a massive difference to the server load.
- it is a 16-processor Linux box with twin 15,000 rpm SCSI drives and 12 GB of RAM - it is quite fast!
Page delivery times improved by 1-2 seconds.
The server is now so lightly loaded that it could be downgraded to save more money.
It has zero effect on SEO. The CDN is accessed using a CNAME
- static.domain.com - so don't worry about it looking like components come from other places.
The CDN has servers all over the world, so no matter where the visitors are, it is only a few hops for them to get most of the content, making it much faster for someone in Australia who would normally pull images from a server in the USA.
Your only problem with Amazon S3 is that it has crashed a few times, but other than that, it is a good thing to do.
I wouldn't advise them to self-host unless you want to increase their costs, server load and page delivery times.
-
Great advice, cheers Jeffery!
-
I work with a number of high-traffic sites (terabytes of data each day, tens of millions of page views per month). With many of these sites, we have offloaded static content either to dedicated static content servers (typically cloud-based so we can scale up and down) or to content delivery networks. I've not had anyone report any SEO impact.
In contrast, they often see user engagement (page views per user), repeat visitors, and other traffic metrics improve. Users like fast sites. Google apparently likes fast sites too, so while I've not seen it myself, you could actually get a boost in your SERPs due to faster-loading pages.
If you break down a modern web page, you will find it requires numerous elements: dozens of images, CSS and JavaScript files, as well as the page itself. Each of these items requires a request to the web server.
With some graphics-intensive sites, I've seen as much as 95% of all web server (HTTP) requests be attributable to static content. By moving these requests to other systems, you free your primary server to handle the application. This provides a better user experience and improves scalability.
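To see the request breakdown for yourself, a small script can tally the static-asset references in a page. This is an illustrative sketch over a made-up HTML snippet, not any real site:

```python
# Count how many of a page's HTTP requests go to static assets
# (images, stylesheets, scripts) versus the HTML document itself.
from html.parser import HTMLParser

class AssetCounter(HTMLParser):
    """Counts tags that trigger extra HTTP requests for static content."""
    def __init__(self):
        super().__init__()
        self.assets = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets += 1
        elif tag == "script" and "src" in attrs:
            self.assets += 1
        elif tag == "link" and attrs.get("rel") == "stylesheet":
            self.assets += 1

# Hypothetical page markup for illustration
page = """
<html><head>
  <link rel="stylesheet" href="/css/site.css">
  <script src="/js/app.js"></script>
</head><body>
  <img src="/img/logo.png"><img src="/img/hero.jpg">
</body></html>
"""

counter = AssetCounter()
counter.feed(page)
total_requests = counter.assets + 1  # +1 for the HTML page itself
print(counter.assets, total_requests)  # 4 static requests, 5 total
```

Even in this tiny example, 4 of the 5 requests are static content that a CDN could serve instead of the origin server.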
Content Delivery Networks
I do not use Amazon's Web Services, so I do not know specifically what they offer, but here are two CDNs I have used with good success:
Internap:
http://www.internap.com/cdn-services-content-delivery-network/
Edgecast:
One feature I look for is called "origin pull." With this method, you do not have to upload files to the CDN; the CDN fetches them automatically from your site as needed. I have found this much easier to manage on sites that have frequent content updates.
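The origin-pull behaviour can be sketched as a simple cache sitting in front of your server. This is a toy model with made-up names, not any CDN's actual API:

```python
# Minimal sketch of origin pull: the CDN edge serves from its cache,
# and only fetches from the origin server on a cache miss.
origin_fetches = []

def fetch_from_origin(path):
    """Stand-in for the CDN requesting the file from your web server."""
    origin_fetches.append(path)
    return f"contents of {path}"

class EdgeCache:
    def __init__(self):
        self.cache = {}

    def get(self, path):
        if path not in self.cache:   # cache miss: pull from origin
            self.cache[path] = fetch_from_origin(path)
        return self.cache[path]      # cache hit: no origin traffic

edge = EdgeCache()
edge.get("/img/logo.png")   # miss -> one origin fetch
edge.get("/img/logo.png")   # hit  -> served from the edge
edge.get("/css/site.css")   # miss -> second origin fetch
print(len(origin_fetches))  # 2
```

Because the edge only contacts the origin on a miss, you never have to push updated files anywhere; you just let the cached copies expire.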
-
Hosting images externally has never had any impact in the cases I have had a chance to observe. The only problems I can think of are that you lose some control over loading times, and that somebody might take an image and link to (credit) the image hosting domain instead of your domain.
-
A couple of notes for you:
- There isn't any SEO impact from WHERE the data is loaded. Look at any major website (especially one that ranks well) and you'll see they're openly using content delivery networks (Akamai, Amazon S3/CloudFront, etc.) for static content. This is good business practice because it takes that load off your web server and often places the content closer to the client. Faster content delivery can help SEO if you have a slow server.
- If they're using the raw S3 buckets, I would HIGHLY suggest signing up for CloudFront. There are two benefits to doing this. First, you put the content into Amazon's cloud, where it is more readily available. Second, you can use domain aliasing to help obscure the source. For instance, let's say you have an images bucket. You could add a CNAME DNS record for images.yourdomain.com and then put that into your source code. You can still see where the DNS takes you, but it's not obvious to the general public. The cost difference between raw S3 delivery and CloudFront is negligible.
Oh, and I use Amazon CloudFront for my own delivery. I've never had any SEO issues with doing so.
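To illustrate the domain-aliasing idea, here is a hedged sketch of rewriting raw S3 URLs to an aliased hostname in your source code. The bucket name and domain are made-up examples, not real infrastructure:

```python
import re

# Hypothetical names: "mybucket" and images.yourdomain.com are examples.
# The DNS record itself would look something like:
#   images.yourdomain.com.  CNAME  mybucket.s3.amazonaws.com.
# (or a *.cloudfront.net target when using CloudFront)

def rewrite_asset_urls(html, bucket="mybucket", alias="images.yourdomain.com"):
    """Swap raw S3 bucket URLs for the friendlier aliased hostname."""
    return re.sub(
        rf"https?://{re.escape(bucket)}\.s3\.amazonaws\.com",
        f"https://{alias}",
        html,
    )

snippet = '<img src="https://mybucket.s3.amazonaws.com/hero.jpg">'
print(rewrite_asset_urls(snippet))
# <img src="https://images.yourdomain.com/hero.jpg">
```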
-
I don't recommend hosting the resources and database on a server other than the one with the site files; it creates some extra traffic between the servers, the resources are harder to load, and the site's optimum speed is decreased. Also, you can't compress this content together, so each file is downloaded independently.