Images on subdomain served from CDN
-
I have a client that uses a CDN to serve images from a subdomain (images.domain.com). We've made sure that the subdomain itself is not blocked. We've added a robots.txt file, we're creating an image sitemap file, and we've verified ownership of the domain within GWT.
Yet any crawler that I use only sees the first page of the subdomain (which is .html), but none of the subsequent URLs, which are all .jpeg.
Is there something simple I'm missing here?
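For anyone verifying the same setup, a quick way to confirm the subdomain isn't blocked is to request its robots.txt and a sample image directly (the image path below is just a placeholder):
curl -I https://images.domain.com/robots.txt
curl -I https://images.domain.com/path/to/sample-image.jpeg
A 200 response on both, with no Disallow rule covering the image paths, rules out the most common blocking problems before you start digging into crawler settings.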
-
Alphonse, it sounded like they were just waiting for the sitemap to launch. Other than that, I couldn't think of anything else to add, because the sitemap should solve their issue. However, I have marked this as "Discussion" again.
-
I am a little confused. The question was marked as answered, but which one is the answer?
-
We have the same issue; however, we have image XML sitemaps in each country subdomain's XML sitemap index, which point to the image files on images.domain.com.
Example:
https://uk.domain.com/image-sitemap1.xml
https://us.domain.com/image-sitemap1.xml
These 2 files are the same.
We also don't have a homepage on images.domain.com; it currently responds with a 404.
Do you think we need to create a landing page at the root of images.domain.com and host the image XML sitemap at https://images.domain.com/images-sitemap1.xml rather than on each country subdomain?
Thanks.
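For reference, a cross-domain image sitemap entry of the kind described above might look like the sketch below (the page and image filenames are placeholders); Google accepts image URLs on a different host as long as both hosts are verified in GWT / Search Console:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page on the country subdomain that displays the image -->
    <loc>https://uk.domain.com/example-page/</loc>
    <image:image>
      <!-- The image file served from the CDN subdomain -->
      <image:loc>https://images.domain.com/example-image.jpeg</image:loc>
    </image:image>
  </url>
</urlset>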
-
Yes, we are doing everything correctly, aside from waiting for the IT department to create a sitemap.
-
Are you using your own subdomain, or one hosted somewhere else (e.g. akamai.com)? You should use your own subdomain if possible.
Was this a change from a previous version that didn't use a CDN? If those images were/are hosted on your primary domain, be sure to match the filenames and paths as closely as possible to what they were before.
If you're doing that, you shouldn't have a problem once the sitemap is submitted.
For more information please check out this post:
http://www.goinflow.com/four-seo-best-practices-for-using-a-content-delivery-network-cdn/
How do you know that Google only attempts to crawl the primary domain URL (i.e. the .html page)? Are you checking log files?
Is the crawler you're using set to crawl external URLs? If not, that could be the issue. Technically, a subdomain is a totally separate website, so most tools don't crawl subdomains by default.
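To illustrate the path-matching point above (both URLs here are placeholders), the goal is for the filename and folder structure to carry over unchanged to the CDN subdomain:
Before (primary domain):  https://www.domain.com/images/example-product.jpeg
After (CDN subdomain):    https://images.domain.com/images/example-product.jpeg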
-
We've correctly set up the CNAME record so that the subdomain points to the CDN. Yet when Google or any other tool attempts to crawl it, only ONE URL shows up, not the images residing on their own independent URLs.
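For anyone comparing setups, that kind of CNAME typically looks like the following in a DNS zone file (the CDN endpoint hostname below is just a placeholder):
images.domain.com.    3600    IN    CNAME    endpoint.example-cdn.net.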
-
In order for the crawler to be able to access those image URLs, you should either:
- Link to the URLs of the images (does that .html page on the subdomain contain these URLs?)
or
- Use the image URLs as resources in pages that are already being crawled. Unfortunately, this can be tricky when dealing with CDNs, since those resources are dynamic.
In either case, the sitemap will solve your problem.
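As a concrete sketch of those two options (the image filename below is a placeholder), either an anchor link or an on-page image reference gives the crawler a path to the CDN URL:
<!-- Option 1: link directly to the image URL -->
<a href="https://images.domain.com/example-image.jpeg">View full-size image</a>
<!-- Option 2: reference the image as a resource on a page that already gets crawled -->
<img src="https://images.domain.com/example-image.jpeg" alt="Example product image">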
-
The sitemap is not completed yet. Server logs show Googlebot only crawling one page, the .html page, and none of the other pages.
-
Did you reference the sitemap in the robots.txt file, or did you set it up in GWT?
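If it helps, a minimal robots.txt on the image subdomain that both allows crawling and references the sitemap might look like this (assuming the sitemap lives at the URL mentioned earlier in the thread):
User-agent: *
Allow: /

Sitemap: https://images.domain.com/images-sitemap1.xml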