Google Indexing of Images
-
Our site is experiencing an issue with the indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so about 2,400 images are submitted in total. Only a few hundred are indexed.
Can adding Microdata improve the indexation of the images?
Our sitemap submits images that are on no-index listing pages to Google. As a result, more than 2,000 images have been submitted but only a few hundred have been indexed. How should the sitemap handle images that reside on no-index pages? Do images that are part of pages set up as "noindex" need a special "noindex" label or other special treatment?
My concern is that having so many unindexed images could be a red flag to Google, signaling poor-quality content.
Is it worth investing in correcting this issue, or would fixing it result in little to no improvement in SEO?
Thanks, Alan
-
I am chiming in a year late, but there is one thing I am not sure I understand. Why would you want to index images on no-index pages? What are these pages that you want no-indexed in the first place? If you do not want these pages to be found in Google Search, why would you want some of their content, like the images, to be found instead?
I am with Michael and recommend that you fix the sitemap. I am also curious to know what has happened in the past year. Have your issues been resolved? Has your SEO improved?
-
I would definitely update that sitemap. If your sitemap is telling Google one thing, and the pages themselves are contradicting the sitemap, AND it's happening thousands of times--that's a negative quality signal to Google, and could affect all sorts of things, from crawl budget to indexation to rankings.
ALT tags are worth fixing as well. That's really the #1 clue Google has about what the images are about. (Other clues: the image filename, and the page title if it's the main image on the page.) Here, I'm presuming that the images are ones you hope to have show up in image search results (otherwise, why would you bother creating an image sitemap?), in which case you really, REALLY need to put the ALT text on them.
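If it helps to scope the work for your developers, here's a rough sketch of how you might audit the listing pages for missing ALT text. It's just an illustration, not your actual setup: the URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed.

```python
# Rough sketch: fetch a handful of listing pages and report images that have
# missing or empty ALT text. Requires the "requests" and "beautifulsoup4" packages.
import requests
from bs4 import BeautifulSoup

# Placeholder URLs -- swap in real listing pages, or pull them from your sitemap.
listing_urls = [
    "https://www.example.com/listings/123-main-st",
    "https://www.example.com/listings/456-oak-ave",
]

for url in listing_urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            print(f"Missing ALT text: {img.get('src')} on {url}")
```

A list like that makes it much easier to estimate how much templating work the ALT fix actually involves.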
-
Apparently our sitemap submits images to Google even when they are on pages that are marked noindex.
The result is that only about 250 out of 2,250 images are actually indexed by Google. As you suggested, Google does not appear to be indexing images that are on pages marked "noindex".
Do you think it makes sense for my developers to modify the sitemap so it no longer submits images that are on pages marked noindex? Is it worth investing resources in fixing this? If it is not going to cause SEO problems, I would just as soon leave it alone.
Also, the way the images are set up, we do not have the ability to customize ALT tags. Is this worth fixing? Could repairing these issues with the images improve overall ranking?
Thanks, Alan
-
I've not seen instances where Google would index an image that's on a page that's marked noindex.
Be sure that you have consistency between your sitemap and your noindex/index tags on the pages, i.e. don't include a page or image in your sitemap where the page itself (or containing page) indicates noindex.
If you look at how Webmaster Tools (OOPS, I guess I mean "Search Console" -- will Google EVER let a product keep the same name forever???) shows indexation of images in an image sitemap, you'll notice they pair the image indexation count with the web page indexation count. I take that as an indication that they're not interested in indexing images on noindexed pages (which I have to say makes sense to me).
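If you want a quick way to find those sitemap-vs-noindex mismatches before your developers regenerate the sitemap, something along these lines would do it. This is only a rough Python sketch, not production code: the sitemap URL is a placeholder, and it assumes a flat sitemap rather than a sitemap index file.

```python
# Rough sketch: read an XML sitemap and flag any URL whose page carries a
# noindex robots meta tag or an X-Robots-Tag: noindex response header.
# Requires the "requests" and "beautifulsoup4" packages.
import xml.etree.ElementTree as ET

import requests
from bs4 import BeautifulSoup

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder -- use your real sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
page_urls = [loc.text.strip() for loc in root.findall(".//sm:url/sm:loc", NS)]

for url in page_urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_meta = soup.find("meta", attrs={"name": "robots"})
    meta_noindex = bool(robots_meta) and "noindex" in robots_meta.get("content", "").lower()
    header_noindex = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
    if meta_noindex or header_noindex:
        print(f"Drop from sitemap (noindex): {url}")
```

Any URL the script prints (along with its image entries) is a candidate to drop from the sitemap, so the sitemap and the on-page tags stop contradicting each other.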