Why Aren't My Images Being Indexed?
-
Hi,
One of my clients submitted an image sitemap with 465 images to Google Search Console on July 20, 2017.
None of the submitted images have been indexed, and I'm wondering why.
Here's the image sitemap: http://www.tagible.com/images_sitemap.xml
We use a CDN for the images, and they're hosted on a subdomain of the client's site, e.g. https://photos.tagible.com/images/Les_Invalides_Court_Of_Honor.jpg
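For reference, the entries follow the standard sitemap image extension, along these lines (the page URL below is just a placeholder, not a real entry from the sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page that displays the image lives on the main domain... -->
    <loc>http://www.tagible.com/example-page/</loc>
    <!-- ...while the image itself is served from the CDN subdomain. -->
    <image:image>
      <image:loc>https://photos.tagible.com/images/Les_Invalides_Court_Of_Honor.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```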
Thanks in advance!
Cheers,
Julian
-
Thanks David! That definitely makes sense. We claimed photos.tagible.com in GSC, so hopefully that does it.
And yes, the images are used on the site, but in an unusual way: http://tagible.com/project/denver-colorado/
-
Thanks Donna! I could see the 403 errors being an issue, as well as the robots.txt file not including the sitemap. I hadn't thought of that.
We're working on getting the https redirects fixed.
-
Hi Julian,
The reason your GSC account isn't reporting your images as indexed is that they're on a different subdomain from the one your GSC property is verified for. GSC only reports indexed URLs that sit on the exact subdomain of that property.
And are the images actually used on the site? None of them showed up in a Screaming Frog crawl...
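If it helps, here's a rough way to check what image references a crawler would actually see in a page's raw HTML (a sketch in Python using the third-party requests package and the page URL mentioned above; images injected by JavaScript won't show up here):

```python
# List the image URLs referenced in one page's raw HTML, to see whether
# the CDN-hosted images are visible to a crawler at all.
from html.parser import HTMLParser
import requests

class ImgCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        # Collect the src attribute of every <img> tag encountered.
        if tag == "img":
            self.srcs.extend(value for name, value in attrs if name == "src" and value)

html = requests.get("http://tagible.com/project/denver-colorado/", timeout=10).text
collector = ImgCollector()
collector.feed(html)
print("\n".join(collector.srcs) or "no <img> tags found")
```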
Cheers,
David
-
It might be a permissions problem.
You said the sitemap is at http://www.tagible.com/images_sitemap.xml, and it is. But the robots.txt file (http://www.tagible.com/robots.txt) doesn't reference that sitemap; it lists 10 others, but not that one.
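If that's just an oversight, the fix is a one-line addition to robots.txt, pointing at the sitemap URL from your question:

```
Sitemap: http://www.tagible.com/images_sitemap.xml
```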
If you go to the subdomain (https://photos.tagible.com/) or the folder (https://photos.tagible.com/images/) where the images are hosted, you get a 403 (Forbidden) response, so crawlers may not be able to navigate to the folder holding the images. The images themselves are accessible with a 200 response, but the subdomain and folder they're stored in are not.
I don't know if you're aware of it, but tagible.com, www.tagible.com, and photos.tagible.com are not redirecting to their https equivalents.
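Here's a rough Python sketch (using the third-party requests package) to reproduce both checks; all URLs are the ones mentioned above:

```python
# Spot-check the 403/200 responses and the missing http-to-https redirects.
import requests

# The subdomain root and image folder currently return 403; the image itself returns 200.
status_checks = [
    "https://photos.tagible.com/",
    "https://photos.tagible.com/images/",
    "https://photos.tagible.com/images/Les_Invalides_Court_Of_Honor.jpg",
]
for url in status_checks:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(f"{resp.status_code}  {url}")

# Each http host should answer with a 301 whose Location header is the https version.
for host in ("tagible.com", "www.tagible.com", "photos.tagible.com"):
    resp = requests.get(f"http://{host}/", allow_redirects=False, timeout=10)
    print(f"{resp.status_code}  http://{host}/ -> {resp.headers.get('Location')}")
```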