Content Delivery Network
-
Anyone have a good reference for implementing a content delivery network? Any SEO pitfalls with using a CDN (brief research seems to indicate no problems)?
I seem to recall that SEOmoz was using Amazon Web Services (AWS) for CDN. Is that still the case?
All CDN & AWS experiences, advice, and references welcomed! Thanks.
-
CloudFlare is the easiest I have found. MaxCDN is another I have heard of. Hosting companies with a presence in the geographic regions you want your site to be close to can provide similar services as well.
-
You shouldn't have any issues, as long as you set it up correctly (use CNAMEs, etc.).
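For reference, pointing a subdomain at a CDN is usually just a CNAME record in your DNS zone. A minimal sketch, with the hostname and provider endpoint made up for illustration:

    ; hypothetical zone entry: static assets served from the CDN's edge network
    cdn.example.com.    3600    IN    CNAME    example-account.cdnprovider.net.

Your pages then reference assets via cdn.example.com, and the CDN fetches from (and caches in front of) your origin server.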
Faster site speed (if you actually use a fast CDN) will also have a positive SEO impact (although we don't know how much weight it is being given).
I believe SEOmoz is still using Amazon CloudFront.
If you want good third-party data on CDN performance, check CloudClimate or Cedexis.
Related Questions
-
Duplicate Content Issues with Pagination
Hi Moz Community, We're an eCommerce site, so we have a lot of pagination issues, but we were able to fix them using the rel=next and rel=prev tags. However, our pages have an option to view 60 items or 180 items at a time. This is now causing duplicate content problems when, for example, page 2 of the 180-item view is the same as page 4 of the 60-item view (URL examples below). Wondering if we should just add a canonical tag pointing to the main view-all page on every page in the paginated series to get rid of this issue.
https://www.example.com/gifts/for-the-couple?view=all&n=180&p=2
https://www.example.com/gifts/for-the-couple?view=all&n=60&p=4
Thoughts, ideas or suggestions are welcome. Thanks
Technical SEO | znotes
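A hedged sketch of the canonical approach this question proposes: every page in the paginated series would carry a tag pointing at the main view-all page (URL assumed from the examples above):

    <!-- hypothetical: placed in the <head> of each paginated URL -->
    <link rel="canonical" href="https://www.example.com/gifts/for-the-couple?view=all" />

Google has said canonicalizing paginated pages to a view-all page is acceptable when a view-all page exists, so this is a plausible fix rather than a guaranteed one.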
Do mobile and desktop sites that pull content from the same source count as duplicate content?
We are about to launch a mobile site that pulls content from the same CMS, including metadata. The mobile site lives on its own hostname, however (www.abcd.com vs. www.m.abcd.com). How will this affect us in terms of search engine ranking?
Technical SEO | ovenbird
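For context, the commonly recommended way to connect separate desktop and mobile URLs is a pair of annotations so search engines treat them as one page rather than duplicates. A minimal sketch (paths assumed for illustration):

    <!-- on the desktop page: point to the mobile equivalent -->
    <link rel="alternate" media="only screen and (max-width: 640px)" href="http://www.m.abcd.com/page" />

    <!-- on the mobile page: point back at the desktop version -->
    <link rel="canonical" href="http://www.abcd.com/page" />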
Quickview popup duplicate content
Hi We have an ecommerce site. We just added a quickview tab to the product list view: when you roll the mouse over it, a popup window with the product image and a short description shows up. Is this a duplicate content problem (it's the same content that's on the product pages, except there we also have a long detailed description)? It is done with JavaScript. Thanks!
Technical SEO | henya
Help With Joomla Duplicate Content
Need another set of eyes on my site from someone with Joomla experience. I'm running Joomla 2.5 (latest version) and SEOmoz is giving me duplicate content errors on a lot of my pages. I checked my sitemap, I checked my menus, and I checked my links, and I can't figure out how SEOmoz is finding the alternate paths to my content. Home page is: http://www.vipfishingcharters.com/ There's only one menu at the top. Take the first link "Dania Beach" under fishing charters for example. This generates the SEF URL:
http://www.vipfishingcharters.com/fishing-charters/broward-county/dania-beach-fishing-charters-and-fishing-boats.html
Somehow SEOmoz (and presumably all other robots) is finding duplicate content at:
http://www.vipfishingcharters.com/broward-county/dania-beach-fishing-charters-and-fishing-boats.html
SEOmoz says the referrer is the homepage/root. The first URL is constructed using the menu aliases. The second one is constructed using the Joomla category and article alias. Where is it getting this, and how can I stop it?
Technical SEO | NoahC
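One common fix for this pattern, assuming the menu-alias URLs are the ones you want indexed, is a canonical tag emitted on the duplicate path pointing at the preferred one. A rough sketch:

    <!-- hypothetical: in the <head> of the category/article-alias version -->
    <link rel="canonical" href="http://www.vipfishingcharters.com/fishing-charters/broward-county/dania-beach-fishing-charters-and-fishing-boats.html" />

This doesn't stop Joomla from generating the second path, but it tells crawlers which version to credit.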
Duplicate Content Issue
Hi Everyone, I ran into a problem I didn't know I had (thanks to the SEOmoz tool) regarding duplicate content. My site is oxford ms homes.net, and when I built the site, the web developer used PHP to build it. After he was done, I saw that the URLs looked like this: "/blake_listings.php?page=0", and I wanted them like this: "/blakes-listings". He changed them with no problem, and he did the same with all 300 or so pages that I have on the site. I just found, using the crawl diagnostics tool, that I have about 3,000 duplicate content issues. Is there an easy fix for this at all, or does he have to go in and 301 redirect EVERY SINGLE URL? Thanks for any help you can give.
Technical SEO | blake-76624
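For anyone hitting the same problem: a pattern-based 301 rule is usually less painful than redirecting every URL by hand. A minimal .htaccess sketch, assuming Apache mod_rewrite and URL shapes taken from the example above:

    # Hypothetical sketch: 301 the old PHP listing URLs to the clean equivalents.
    RewriteEngine On
    # Match e.g. /blake_listings.php?page=0 and send it to /blakes-listings;
    # the trailing "?" on the target drops the old query string.
    RewriteCond %{QUERY_STRING} ^page=\d+$
    RewriteRule ^blake_listings\.php$ /blakes-listings? [R=301,L]

A rule like this only covers one page name; in practice each old URL pattern would need its own rule or a lookup map, but it's still far fewer rules than URLs.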
Displaying static content - risky?
In an attempt to improve the speed of our site, we have installed the Cache_Lite extension for PHP. It's a PEAR-based system which converts dynamic pages into static pages. The system is set to delete the temp files every 15 minutes, at which point any changes / new content will appear on the site. I don't see any risk in doing this, but thought it safe to double-check whether there could be any impact on Google.
Technical SEO | ukss1984
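For context, a minimal Cache_Lite setup looks roughly like this (the cache directory and the page-rendering function are illustrative). The cached copy is the same HTML a dynamic request would produce, which is why crawlers shouldn't see anything different:

    <?php
    require_once 'Cache/Lite.php';

    // Hypothetical sketch: serve a cached copy of the page for 15 minutes (900 s).
    $options = array(
        'cacheDir' => '/tmp/cache/',   // assumed cache directory
        'lifeTime' => 900,             // matches the 15-minute expiry described above
    );
    $cache = new Cache_Lite($options);

    $id = $_SERVER['REQUEST_URI'];     // one cache entry per URL

    if ($page = $cache->get($id)) {
        echo $page;                    // cache hit: output the stored HTML
    } else {
        ob_start();
        render_dynamic_page();         // hypothetical function that builds the page
        $page = ob_get_clean();
        $cache->save($page, $id);      // store it for the next 15 minutes
        echo $page;
    }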
Duplicate homepage content
Hi, I recently did a site crawl using the SEOmoz crawl test. My homepage seems to have 3 cases of duplicate content. These are the URLs:
www.example.ie/
www.example.ie/%5B%7E19%7E%5D
www.example.ie/index.htm
Does anyone have any advice on this? What impact does this have on my SEO?
Technical SEO | Socialdude
The Bible and Duplicate Content
We have our complete set of scriptures online, including the Bible at http://lds.org/scriptures. Users can browse to any of the volumes of scriptures. We've improved the user experience by allowing users to link to specific verses in context, which will scroll to and highlight the linked verse. However, this creates a significant amount of duplicate content. For example, these links:
http://lds.org/scriptures/nt/james/1.5
http://lds.org/scriptures/nt/james/1.5-10
http://lds.org/scriptures/nt/james/1
All of those will link to the same chapter in the book of James, yet the first two will highlight verse 5 and verses 5-10 respectively. This is a good user experience, because in other sections of our site and on blogs throughout the world webmasters link to specific verses so the reader can see the verse in context of the rest of the chapter. Another Bible site has separate HTML pages for each verse individually and tends to outrank us because of this (and possibly some other reasons) for long-tail chapter/verse queries. However, our tests indicated that the current version is preferred by users.

We have a sitemap ready to publish which includes a URL for every chapter/verse. We hope this will improve indexing of some of the more popular verses. However, Googlebot is going to see some duplicate content as it crawls that sitemap! So the question is: is the sitemap a good idea, realizing that we can't revert back to including each chapter/verse on its own unique page? We are also going to recommend that we create unique titles for each of the verses and pass a portion of the text from the verse into the meta description. Will this perhaps be enough to satisfy Googlebot that the pages are in fact unique? They certainly are from a user perspective. Thanks all for taking the time!
Technical SEO | LDS-SEO
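To make the proposed unique titles and descriptions concrete, the head of a verse URL might look something like this (a sketch only; the description text would be pulled from the verse itself):

    <!-- hypothetical <head> fragment for http://lds.org/scriptures/nt/james/1.5 -->
    <title>James 1:5</title>
    <meta name="description" content="If any of you lack wisdom, let him ask of God, that giveth to all men liberally..." />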