Can we validate a CDN like MaxCDN in Webmasters?
-
Hi, can we validate a CDN like MaxCDN in Webmasters?
We have images hosted on a CDN and they don't get indexed in Google Images. It's been a year now and no luck. MaxCDN says they have no issues at their end; the images have ALT text and they are original images with no copyright issues.
-
You can't get their domain validated in your Google Search Console account (unless they add you as a user to their account); however, there is a workaround, more or less described here: Getting CDN images indexed with Google
In short:
If your image is, for example, at https://12345.maxcdn.com/images/directory/subdirectory/image-filename.jpg, you can add a CNAME (alternate domain name) so the URL is formatted like this: http://images.yourdomain.com/images/directory/subdirectory/image-filename.jpg
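If you want to sanity-check the CNAME once it's in place, here is a quick stdlib sketch; the hostnames are the placeholders from the example above, so substitute your own:

```python
import socket

# Placeholder hostnames from the example above; substitute your own.
cdn_host = "12345.maxcdn.com"
alias = "images.yourdomain.com"

# Once the CNAME record has propagated, resolving the alias should end
# up at the same CDN edge IPs as resolving the CDN hostname directly.
# gethostbyname_ex returns (canonical name, alias list, IP list).
print(socket.gethostbyname_ex(alias))
print(socket.gethostbyname_ex(cdn_host))
```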
After that you can add "images.yourdomain.com" as a verified domain in Google Search Console and treat it as a new subdomain, including sitemaps, etc.
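For the sitemap part, a minimal sketch that generates an image sitemap for the new subdomain; the page and image URLs are the placeholders from above (the page-to-image mapping is hypothetical), and the xmlns:image namespace is Google's image sitemap extension:

```python
# Minimal image-sitemap generator. page_to_images maps each page URL
# to the CDN-hosted images it displays (placeholder data).
page_to_images = {
    "http://www.yourdomain.com/some-page/": [
        "http://images.yourdomain.com/images/directory/subdirectory/image-filename.jpg",
    ],
}

lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
    '        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">',
]
for page, images in page_to_images.items():
    lines.append("  <url>")
    lines.append(f"    <loc>{page}</loc>")
    for img in images:
        lines.append(f"    <image:image><image:loc>{img}</image:loc></image:image>")
    lines.append("  </url>")
lines.append("</urlset>")

with open("image-sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))
```

You can then submit the resulting file under the newly verified property in Search Console.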
Hope this helps
Related Questions
-
Website ranking declined after connecting to CDN
Hi! We are trying to rank https://windowmart.ca for various local search terms. Our head office is in Edmonton, where we try to rank https://windowmart.ca/edmonton-windows-doors/ for such terms as "windows Edmonton", "replacement windows Edmonton", "windows and doors Edmonton", and others. The website was the leader in its niche for around 2 years. Then we had some server-related issues, moved to a new server, and connected the CDN Nitropack, which really improved our Google speed test results. Recently we noticed that our rankings started to drop. Do you know if Nitropack can negatively affect local SEO rankings? Thank you!
Technical SEO | vaskrupp
-
Can you confirm legitimate Google Bot traffic?
We use Cloudflare as a firewall, and I noticed a significant number of blocked bot requests. One of the things Cloudflare does is try to block bad bot traffic, but it seems it is mistakenly blocking Google Bot traffic. If you use Cloudflare, you may want to look into this as well. Also, can you confirm whether the following IPs are legitimate Google Bots?
66.249.79.88
66.249.79.65
66.249.79.80
66.249.79.76
Thanks!
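Those addresses sit in the well-known 66.249.x.x Googlebot range, but the check Google actually documents is a reverse DNS lookup followed by a confirming forward lookup. A stdlib sketch of that check:

```python
import socket

# Google's documented verification: reverse-resolve the IP, confirm the
# hostname ends in googlebot.com or google.com, then forward-resolve
# that hostname and confirm it maps back to the same IP.
def is_googlebot(ip: str) -> bool:
    try:
        host = socket.gethostbyaddr(ip)[0]
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

for ip in ["66.249.79.88", "66.249.79.65", "66.249.79.80", "66.249.79.76"]:
    print(ip, is_googlebot(ip))
```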
Technical SEO | akin67
-
Can I redirect a link even if the link is still on the site
Hi folks, I've got a client who has duplicate content because they deliberately create it, storing the same piece of content in 2 different places. When they generate this duplicate content, it creates a 2nd link on the site pointing to the duplicate. Now they want the 2nd link to always redirect to the first, but for architecture reasons they can't remove the 2nd link from the site navigation. We can't use rel=canonical because they don't want visitors going to that 2nd page. Here is my question: are there any adverse SEO implications to maintaining a link on a site that always redirects to a different page? I've already gone down the road of "don't deliberately create duplicate content" with the client. They've heard me, but won't change. So, what are your thoughts? Thanks!
Technical SEO | Rock33
-
Webmaster crawl errors caused by Joomla menu structure
Webmaster Tools is reporting crawl errors for pages that do not exist, due to how my Joomla menu system works. For example, I have a menu item named "Service Area" that holds 3 sub-items but has no actual page of its own. This results in URLs like domainDOTcom/service-area/service-page.html. Because the "Service Area" menu item is constructed in a way that looks like a link to the bot (the link is to "javascript:;"), I am getting a 404 error saying it can't find domainDOTcom/service-area/. Note, the error doesn't say domainDOTcom/service-area/javascript:;, it just says /service-area/. What is the best way to handle this? Can I do something in robots.txt to tell the bot that /service-area/ should be ignored, but any page after /service-area/ is good to go? Should I just mark them as fixed, since it's not really a 404 a human will encounter, or is it best to somehow explain this to the bot? I was advised on the Google forums to try this, but I'm nervous about it:
Disallow: /service-area/*
Allow: /service-area/summerlin-pool-service
Allow: /service-area/north-las-vegas
Allow: /service-area/centennial-hills-pool-service
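For what it's worth, here is a minimal sketch of how Google evaluates rules like those (the most specific, i.e. longest, matching pattern wins, and Allow wins a tie). It's an illustration of the precedence rule, not a full robots.txt parser:

```python
import re

# The rules quoted above.
rules = [
    ("Disallow", "/service-area/*"),
    ("Allow", "/service-area/summerlin-pool-service"),
    ("Allow", "/service-area/north-las-vegas"),
    ("Allow", "/service-area/centennial-hills-pool-service"),
]

def matches(pattern: str, path: str) -> bool:
    # robots.txt patterns anchor at the start of the path;
    # '*' matches any run of characters.
    regex = "^" + ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.match(regex, path) is not None

def is_allowed(path: str) -> bool:
    # Longest matching pattern wins; Allow wins ties.
    best_len, allowed = -1, True  # no matching rule means allowed
    for verdict, pattern in rules:
        if matches(pattern, path):
            if len(pattern) > best_len or (len(pattern) == best_len and verdict == "Allow"):
                best_len, allowed = len(pattern), verdict == "Allow"
    return allowed

print(is_allowed("/service-area/"))                 # False: blocked
print(is_allowed("/service-area/north-las-vegas"))  # True: crawlable
```

Run as-is, /service-area/ comes back blocked while the three service pages stay crawlable, which is what that forum advice is aiming for.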
I tried a 301 redirect of /service-area to the home page, but then it pulls that out of the URL and my landing pages become 404s. http://www.lvpoolcleaners.com/ Thanks for any advice! Derrick
Technical SEO | dwallner
-
Weird Cigarette URLs showing up in Google Webmaster Tools
Hi there, I'm noticing a bunch of cigarette-related URLs showing up in my Google Webmaster Tools; they are appearing as 404s in the crawl error report, which is why they are listed there. Anyone have any idea what this could be? I recently switched from WordPress to Shopify, and these weird URLs just started appearing in my Webmaster Tools in the last week. Kinda bizarre / a little alarming! Thanks,
Bianca
Technical SEO | TheBatesMillStore
-
Does it sound like a linkwheel to you?
I have a small group of insurance-related blogs and sites (20). I have been creating useful articles and linking them to my main site. These sites are not interlinked, and they are all on different C-class IPs. I was thinking of linking some of them to some other good resources that I have on article sites like ezinearticles.com, with the idea of making my articles available to my readers and also increasing the rate at which they get crawled by the bots. Maybe even boosting those pages' Page Authority. Thanks, Gabe
Technical SEO | ggplaylist
-
How can I tell Google that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic; half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change. So I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with the following headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately that would be quite hard for us. Maybe Google would then also spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot traffic on irrelevant pages? Thanks for your help, Cord
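On the 304 idea, here is a minimal sketch of conditional-GET handling using only the Python standard library. The fixed LAST_MODIFIED date is a stand-in for a page's real last-change time, and this is an illustration rather than something matched to any particular stack:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for the page's real last-change time.
LAST_MODIFIED = datetime(2013, 1, 1, tzinfo=timezone.utc)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ims = self.headers.get("If-Modified-Since")
        if ims:
            try:
                if parsedate_to_datetime(ims) >= LAST_MODIFIED:
                    # The client's copy is current: say so, send no body.
                    self.send_response(304)
                    self.end_headers()
                    return
            except (TypeError, ValueError):
                pass  # unparsable date header: fall through to a full reply
        body = b"<html><body>unchanged party pics</body></html>"
        self.send_response(200)
        # Advertise Last-Modified so crawlers can send If-Modified-Since.
        self.send_header("Last-Modified", format_datetime(LAST_MODIFIED, usegmt=True))
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

When the crawler sends If-Modified-Since for a genuinely unchanged page, the 304 saves re-sending the body, which is exactly the traffic you want to cut.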
Technical SEO | bimp
-
Google Webmasters shows 37K not found errors
Hello, we are using Joomla as our CMS. Months ago we used a component to create friendly URLs, and lots of them got indexed by Google. While testing the component we created three different types of URL; the problem now is that all of these test URLs are showing in Google Webmasters as 404 errors: 37,309 not found pages, and this number is increasing every day. What do you suggest to fix this? Regards.
Technical SEO | Zertuxte