Images on a subdomain served from a CDN
-
I have a client that uses a CDN to serve images from a subdomain (images.domain.com). We've made sure that the subdomain itself is not blocked. We've added a robots.txt file, we're creating an image sitemap file, and we've verified ownership of the domain within GWT (Google Webmaster Tools).
Yet any crawler that I use only sees the first page of the subdomain (which is .html) but none of the subsequent URLs, which are all .jpeg.
Is there something simple I'm missing here?
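For reference, the robots.txt on the image subdomain is minimal, along these lines (the sitemap filename is a placeholder for the file that's still being created):

```
# https://images.domain.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://images.domain.com/image-sitemap.xml
```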
-
Alphonse, it sounded like they were just waiting for the sitemap to launch. Other than that, I couldn't think of anything else to add, because the sitemap should solve their issue. However, I have marked this as "Discussion" again.
-
I am a little confused. The question was marked answered, but which one is the answer?
-
We have the same issue; however, we have image XML sitemaps on each country subdomain's XML index, which point to the image files on images.domain.com.
Example:
https://uk.domain.com/image-sitemap1.xml
https://us.domain.com/image-sitemap1.xml
These 2 files are the same.
We also don't have a homepage on images.domain.com; it currently responds with a 404.
Do you think we need to create a landing page at the homepage and host the image XML sitemap at https://images.domain.com/images-sitemap1.xml rather than on each subdomain?
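For context, each entry in those sitemaps looks roughly like this (the URLs are illustrative, not our real paths); the page lives on the country subdomain while the image lives on images.domain.com:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://uk.domain.com/products/widget</loc>
    <image:image>
      <!-- Cross-host image reference; the CDN host is verified in GWT -->
      <image:loc>https://images.domain.com/products/widget.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```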
Thanks.
-
Yes, we are doing everything correctly, aside from waiting for the IT department to create a sitemap.
-
Are you using your own subdomain or one hosted somewhere else (e.g., akamai.com)? You should use your own subdomain, if possible.
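Using your own subdomain typically just means a CNAME record pointing at the CDN, something like this (the CDN target hostname here is a made-up placeholder):

```
; DNS zone entry (hypothetical CDN target)
images.domain.com.    IN    CNAME    d1234abcd.cdnprovider.net.
```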
Was this a change from a previous version that didn't use a CDN? If those images were/are hosted on your primary domain, be sure to match the filenames and paths as closely as possible to what they were before.
If you're doing that, you shouldn't have a problem once the sitemap is submitted.
For more information, please check out this post:
http://www.goinflow.com/four-seo-best-practices-for-using-a-content-delivery-network-cdn/
How do you know that Google only attempts to crawl the primary domain URL (i.e. the .html page)? Are you checking log files?
Is the crawler you're using set to crawl external URLs? If not, that could be the issue. Technically, a subdomain is a totally separate website, so most tools don't crawl subdomains by default.
-
We've correctly applied the CNAME record from the CDN so that it resolves to the subdomain. Yet when Google or any other tool attempts to crawl it, only ONE URL shows up, not the images, which reside on their own independent URLs.
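As a sanity check, the CNAME does resolve as expected (the target shown is a placeholder, not the real CDN hostname):

```
$ dig +short images.domain.com CNAME
d1234abcd.cdnprovider.net.
```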
-
In order for a crawler to be able to access those image URLs, you should either:
- Link to the URLs of the images (does that .html page on the subdomain contain these URLs?)
or
- Use the image URLs as resources in pages that have already been crawled (see the example below). Unfortunately, this can be tricky when dealing with CDNs, since those resources are dynamic.
In either case, the sitemap will solve your problem.
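To illustrate the second option, a page on the main domain would reference the CDN-hosted image like this (the file path is hypothetical):

```
<!-- On a page such as https://www.domain.com/products/widget -->
<img src="https://images.domain.com/products/widget.jpg" alt="Blue widget, front view">
```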
-
The sitemap is not completed yet. Server logs show Googlebot requesting only one page, the .html page, not the other URLs.
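For reference, this is roughly how we pull Googlebot requests out of the server logs (the log path and combined log format are assumptions about our setup):

```
# Count Googlebot requests per URL path
grep -i "Googlebot" /var/log/apache2/access.log | awk '{print $7}' | sort | uniq -c | sort -rn
```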
-
Did you reference the sitemap in the robots.txt file, or did you set it up in GWT?