Google Indexing of Images
-
Our site is experiencing an issue with indexation of images. The site is real estate oriented. It has 238 listings with about 1,190 images. The site submits two versions (different sizes) of each image to Google, so about 2,400 image URLs are submitted in total. Only several hundred are indexed.
Can adding Microdata improve the indexation of the images?
Our site map is submitting images that are on no-index listing pages to Google. As a result, more than 2,000 images have been submitted but only a few hundred have been indexed. How should the site map deal with images that reside on no-index pages? Do images that are part of pages set up as "no-index" need a special "no-index" label or other special treatment?
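For context, an image sitemap entry ties each image to the page it appears on; the URLs below are made up for illustration. The conflict is that the sitemap offers the image for indexing while the page itself carries a noindex robots tag:

```xml
<!-- Entry in the image sitemap (URLs are hypothetical): -->
<url>
  <loc>https://example.com/listing/123</loc>
  <image:image>
    <image:loc>https://example.com/photos/123-large.jpg</image:loc>
  </image:image>
</url>
<!-- ...while /listing/123 itself serves:
     <meta name="robots" content="noindex"> -->
```

The `image:` prefix comes from the `http://www.google.com/schemas/sitemap-image/1.1` namespace declared on the sitemap's `<urlset>` element.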
My concern is that having so many images that are not indexed could be a red flag to Google, signaling poor-quality content.
Is it worth investing in correcting this issue, or will correcting it result in little to no improvement in SEO?
Thanks, Alan
-
I am chiming in a year late, but there is just one thing I am not sure I understand: why would you want to index images on no-index pages? What are these pages that you want no-indexed in the first place? If you do not want these pages to be found when searching in Google, why would you want some of their content, like the images, to be found instead?
I am with Michael and recommend that you fix the sitemap. I am also curious to know what has happened in the past year. Have your issues been resolved? Has your SEO improved?
-
I would definitely update that sitemap. If your sitemap is telling Google one thing, and the pages themselves are contradicting the sitemap, AND it's happening thousands of times--that's a negative quality signal to Google, and could affect all sorts of things, from crawl budget to indexation to rankings.
ALT tags are worth fixing as well. That's really the #1 clue Google has as to what the images are about. (Other clues: the image filename, and the page title if it's the main image on the page.) Here, I'm presuming that the images are ones you hope will show up in image search results (otherwise why would you bother creating an image sitemap?)...in which case, you really, REALLY need to put the ALT text on them.
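As a rough illustration (this is a hypothetical script, not anything Moz provides), a crawl of your listing pages could flag images with missing or empty ALT text using only the Python standard library:

```python
# Sketch: find <img> tags missing ALT text in a page's HTML.
# Uses only the stdlib; URLs and markup below are made up.
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects the src of every <img> with no alt attribute (or an empty one)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # None or "" both count as missing
                self.missing.append(attrs.get("src", "(no src)"))

def find_images_missing_alt(html: str):
    finder = MissingAltFinder()
    finder.feed(html)
    return finder.missing

# Example: only the second image would be flagged.
page = ('<img src="/photos/123-front.jpg" alt="3-bed house, front view">'
        '<img src="/photos/123-kitchen.jpg">')
print(find_images_missing_alt(page))  # ['/photos/123-kitchen.jpg']
```

Running something like this over the 238 listing pages would tell you quickly how big the ALT problem actually is before deciding whether it's worth developer time.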
-
Apparently our site map submits images to Google even when they are on pages that are marked as no-index.
The result is that only about 250 out of 2,250 images are actually indexed by Google. As you suggested, Google is apparently not indexing images that are on pages marked "no-index".
Do you think it makes sense for my developers to modify the site map so it no longer submits images that are on no-index pages? Is it worth investing resources in fixing this? If this is not going to cause SEO problems, I would just as soon leave it alone.
Also, the way the images are set up, we do not have the ability to customize ALT tags. Is this worth fixing? Could repairing these image issues improve our overall ranking?
Thanks, Alan
-
I've not seen instances where Google would index an image that's on a page that's marked noindex.
Be sure that you have consistency between your sitemap and your noindex/index tags on the pages, i.e. don't include a page or image in your sitemap where the page itself (or containing page) indicates noindex.
If you look at how Webmaster Tools OOPS I guess I mean "Search Console" (will Google EVER let a product keep the same name forever???) shows indexation of images in an image sitemap, you'll notice they pair the image indexation count with the web page indexation count. I take that as an indication that they're not interested in indexing images on noindexed pages (which, I have to say, makes sense to me).
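As a sketch of that consistency check (the function and data shapes here are made up for illustration, not a real sitemap library), a sitemap generator could drop image entries whose containing page is noindexed before writing the file:

```python
# Sketch: filter image-sitemap entries so that no image is submitted
# for a page that carries a noindex robots tag. Names are illustrative.
def filter_sitemap_entries(entries, noindexed_pages):
    """Keep only entries whose containing page is not noindexed.

    entries: list of (page_url, image_url) pairs destined for the image sitemap.
    noindexed_pages: set of page URLs that serve a noindex robots tag.
    """
    return [(page, img) for page, img in entries if page not in noindexed_pages]

entries = [
    ("https://example.com/listing/1", "https://example.com/img/1-large.jpg"),
    ("https://example.com/listing/1", "https://example.com/img/1-small.jpg"),
    ("https://example.com/listing/2", "https://example.com/img/2-large.jpg"),
]
noindexed = {"https://example.com/listing/2"}  # listing 2 is noindexed
print(filter_sitemap_entries(entries, noindexed))
# Both images for listing 1 survive; listing 2's image is dropped.
```

The point is that the noindexed-page set and the sitemap should be built from the same source of truth, so the two signals can never contradict each other.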