Subdomains & CDNs
-
I've set up a CDN to speed up my site, with a CNAME record mapping the subdomain cdn.example.com to the URL where the CDN hosts my static content (images, CSS and JS files, and PDFs).
www.example.com and cdn.example.com now resolve to two different IP addresses.
Internal links to my PDF files (white papers and articles) used to point to www.example.com/downloads, but now they point to cdn.example.com/downloads.
The same PDF files can be accessed at both the www and the cdn subdomains, so existing external links to the www versions will continue to work.
Question 1: Should I set up 301 redirects in .htaccess such as:
Redirect permanent /downloads/filename.pdf http://cdn.example.com/downloads/filename.pdf
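Or, rather than one line per file, would a single pattern-based rule be better? A sketch, assuming Apache's mod_alias and that every PDF lives under /downloads/:
RedirectMatch permanent ^/downloads/(.*\.pdf)$ http://cdn.example.com/downloads/$1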
Question 2: Do I need to do anything else in my .htaccess file (or anywhere else) to ensure that any SEO benefit provided by the PDF files remains associated with my domain?
Question 3: Am I better off keeping my PDF files on the www side and off the CDN?
Thanks,
Akira
-
On your CDN, you have the option to disallow search engines from crawling files on the subdomain via robots.txt. That way you can keep a copy on cdn.example.com for the faster user experience while the www copies remain the ones that get indexed.
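A minimal sketch of that robots.txt, served at cdn.example.com/robots.txt (assuming your CDN lets you control the subdomain's robots.txt):
User-agent: *
Disallow: /
With the cdn copies blocked from crawling, the www versions remain the only crawlable copies, which keeps the duplicate-content question moot.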
-
Yup...keep it simple...where possible
-
There are about a dozen PDFs and they are indexed.
I agree that it's probably better to play it safe and keep these files on www.
Thanks!
-
How many PDF files are we talking about? Are they indexed? Is there unique text content, or images with text that Google has OCR'd? If yes, I would suggest keeping them on www and thereby keeping it simple. Again, the number of PDF files, as well as their total size in gigabytes, would make this an easy answer.
Related Questions
-
Dealing with Expired & Reoccurring Content At Scale
Hello, I have a question concerning maintaining & pruning content on a large site that has a ton of pages that are either expired OR reoccurring. Firstly, there are ~12,000 pages on the site. Large sections of the site have individual landing pages for time-sensitive content, such as promotions and shows, and there are TONS of shows every day, so the number of pages to manage keeps increasing.
Show URLs: I'm auditing the show URLs and looking at pages that have backlinks; those I am redirecting to the main show pages. However, a significant number of show URLs from a few years ago (2012, 2013, 2014, 2015) DON'T get traffic or have any backlinks (or ranking keywords). Can I delete these pages entirely from the site, or should I go through the process of 410-ing them (and then deleting them? Or can you let 410s sit?)? They are in the XML sitemap right now, so they get crawled, but are essentially useless. I want to cut off the dead weight, but I'm worried about deleting a large number of pages from the site at once. For show URLs that are obsolete but still rank well for keywords and get some traffic, is there any recommended option? Should I bother adding them to a past-shows archive section since they bring in a LITTLE traffic, or ax them, since it's such a small amount compared to what the main pages get? There are also URLs that are orphaned and obsolete right now but will reoccur. For instance, when an artist performs, they get their own landing page; they may acquire some backlinks and rank, but then that artist doesn't come back for a few months. The page just sits there, orphaned and in the XML sitemap. However, regardless of backlinks/keywords, the page will come back eventually. Is there any recommended way to maintain this kind of situation? Again, there are a LOT of URLs in this same boat.
Promotional URLs: I'm going through the same process for promotions and, thankfully, the scale of the issue is much smaller. Same question as above: they have some promotional URLs, like NYE Special Menu or Lent Specials landing pages for each of their restaurants. These pages are only valid for a short time each year and are otherwise obsolete. I want to reuse the pages each year, though, and don't want them to just sit there in the XML sitemap. Is there ever an instance where I might want to 302 redirect them, and then remove the 302 for the short time they are valid? I'm not AS concerned about the recycled promotional URLs, as there are far fewer of them. However, as you can probably tell, this large site has this problem of reoccurring content throughout, and I'd like to get a plan in place to clean it up and then create rules to maintain it. There are thousands of show URLs with this issue, so I really need to determine the best play here. Any help is MUCH appreciated!
Technical SEO | | triveraseo
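For the dead 2012-2015 show URLs, a sketch of the 410 route in .htaccess (assuming Apache's mod_alias, and hypothetical /shows/YEAR/ paths) would be:
RedirectMatch gone ^/shows/201[2-5]/
A 410 can sit indefinitely; once the engines have recrawled the URLs a few times, they drop out of the index. Whichever route you take, remember to remove the URLs from the XML sitemap as well.
-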
HTTPS & HTTP
I have my website (http://thespacecollective.com) marked in Google Webmaster Tools as having HTTP as the primary domain, as opposed to HTTPS. But should all of my on-page links be HTTP? For instance, if I click the Home button on my home page it takes the user to HTTP, but if you type the domain name in the address bar it takes you to HTTPS. Could this be causing me problems for SEO?
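For illustration, a sketch of forcing every request onto the declared HTTP version via .htaccess (assuming Apache with mod_rewrite; reverse the condition if you standardise on HTTPS instead):
RewriteEngine On
RewriteCond %{HTTPS} on
RewriteRule ^(.*)$ http://thespacecollective.com/$1 [R=301,L]
Either way, the fix is consistency: one scheme that every internal link and redirect agrees on.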
Technical SEO | | moon-boots
-
Webpages & Images Index Graph Gone Down Badly in Google Search Console. Why?
Hello All, What is going on with the Sitemap Index Status in Google Search Console?
Webpages: 35,000 submitted, index now showing 21,000, whereas previously approx. 34,500 were indexed.
Images: 85,000 submitted, index now showing 11,000, whereas previously approx. 80,000 were indexed.
Yet when I search in Google, site:abcd.com shows approx. 27,000 indexed webpages. There has been no message from Google about a penalty or warning, etc. Please help.
Technical SEO | | wright335
-
Rankings of Subdomains vs. Main Domain
Here's a puzzler... Our main domain (www.ides.com) doesn't appear in Google (but does on Bing and other engines). We think this is due to duplicate content, which we're fixing. However, our website's subdomains continue to appear prominently in SERPs, even on Google. Here are some examples (keyword = ranking subdomain): IDES Prospector = prospector.ides.com; IDES = support.ides.com; Cycolac FR15 = catalog.ides.com. Why would Google penalize a main domain but not its subdomains?
Technical SEO | | Prospector-Plastics
-
How is Google finding our preview subdomains?
I've noticed that Google is able to find, crawl, and index preview subdomains we set up for new client sites (e.g. clientpreview.example.com). I know now to use meta name="robots" (and robots.txt) to block the search engines from crawling these subdomains. My question, though, is how is Google finding these subdomains? We don't link to these preview domains from anywhere else, so I can't figure out how Google is even getting there. Does anybody have any insight on this?
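For reference, a sketch of the two blocks mentioned, for an Apache-served preview subdomain (the header form, assuming mod_headers is enabled, also covers non-HTML files like PDFs):
<meta name="robots" content="noindex, nofollow">
Header set X-Robots-Tag "noindex, nofollow"
One caveat: if robots.txt blocks crawling entirely, the engines may never fetch a page to see its noindex, so pick one mechanism deliberately.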
Technical SEO | | ZeeCreative
-
SEO basics for Q&A tool
Hi everyone, our company wants to launch a Q&A forum on our website. The goal is to keep users interacting with our website, generate leads (of course) and, last but not least, generate UGC for our website (and Google, of course). [We organise career events with big companies for students and professionals, give career advice, etc.] From an SEO perspective, I find the following points difficult to overcome:
The possible problem of "thin" content: many URLs with a question and only 1 or 2 answers will not look good to Google, especially when there are a lot of them (Panda update). One solution could be to noindex pages with thin content, but imagine an active community; this could take ages, and we have other things to do.
The problem of finding ALL content: what would be the best solution to make sure that Google finds all UGC, even the older content? Would it be enough to link to older questions on the page of the current question? Let's say this page contains links to the 5 questions before it, and so on. Or should there be categories of questions, where you list all of the questions ever asked?
Can one optimise the content? Users do not ask questions with the beloved keywords, and if there were a standard solution where the URL and the title tag contain the question, there could be a lot of strange/not useful pages on our domain.
I hope I could make clear what my problems are and I hope someone can give me some good advice. Thanks!!
Technical SEO | | accessKellyOCG
-
Microsite & Duplicate Content Concern
I have a client that wants to put up a micro-site. It's not really even a niche micro-site; it's his whole site less a category and a few other pages. He is a plastic surgeon that offers cosmetic surgery services for the Face, Breast, and Body at his private practice in City A. He has partnered with another surgeon in City B whose surgical services are limited to only the Face. City B is nearby, but not so close that they consider themselves competitors for Facial surgery. The doctors' agreement is that my client will perform only Breast and Body surgery at the City B location. He can market himself in City B (which he currently is not doing on his main site) but only for Breast and Body procedures, and is not to compete for Facial surgery. Therefore, he needs this second site to not include content about Facial surgery. My concern is duplicate content. His requested plan: the micro-site will be on a different domain and C-block; the content, location keywords and meta data will be completely re-written to target City B. However, he wants to use the same theme as his main site - same source code, HTML/CSS, same top-level navigation, same sub-navigation less the Face section, same images/graphics, same forms, etc. Is it okay to have the same exact site built on a different domain with rewritten copy (less a few pages) to target the same base keywords with only a different location? The site is intended for a different user group in City B, but I'm concerned the search engines won't like this and will trigger the filters. I've read a bunch of duplicate content articles, including the post-Panda one by Dr. Pete. Great post, but it doesn't really answer this particular issue of duplicating code for a related site. Can anyone make a case for or against this? Thanks in advance!
Technical SEO | | cmosnod
-
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. Pages vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, referenced from a separate robots.txt file on that subdomain, e.g. http://windows7.iyogi.com/robots.txt, with the subdomain's XML sitemap at http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only 1 robots.txt file covering the main domain and the subdomains. Please tell me, shall I create a separate robots.txt and XML sitemap file for each subdomain, or just 1 file? Creating a separate XML sitemap for each subdomain does not seem feasible, as we would have to verify each one in GWT separately. Is there any automatic way, and do I have to ping the engines separately if I add new pages to a subdomain? Please advise me.
Technical SEO | | vaibhav45
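For illustration, each subdomain's robots.txt only needs a Sitemap line to announce its own sitemap; the engines discover it on their own, with no per-subdomain verification or pinging required. A sketch, using a hypothetical subdomain:
User-agent: *
Disallow:
Sitemap: http://sub1.example.com/sitemap.xml.gz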