Google doesn't index image slideshow
-
Hi,
My articles are indexed, and so are the full-size images referenced via a meta tag in the body. But the images in the slideshow are not indexed. Do you have any idea why? Could it be a problem with the JS?
Example : http://www.parismatch.com/People/Television/Sport-a-la-tele-les-femmes-a-l-abordage-962989
Thank you in advance
Julien
-
You can do a "site:" search directly in Google like this, and I currently see this --> http://screencast.com/t/ZVqq5iumQ. You can do a site: search on the whole domain, a subfolder, or a specific page, etc.
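For example (illustrative queries only, adjust them to the pages you want to check): "site:parismatch.com" for the whole domain, "site:parismatch.com/People/Television/" for a subfolder, or "site:cdn-parismatch.ladmedia.fr" for the image CDN. Running these with Google Images selected shows specifically which images are indexed.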
-
OK, what is the best method you recommend to verify image indexation directly in Google?
I will post a message explaining the change after we update the sitemaps.
Thanks for all your help
-
Thanks! OK, yes I'd make your Sitemap and HTML image URLs the same.
Also, that's a LOT of images, so I'm not surprised Google is taking time to index them.
Also, there can sometimes be a delay in Search Console data, so you can always check Google itself to see which files are indexed.
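For reference, a minimal image sitemap entry normally looks something like the sketch below (the URLs here are placeholders, not the site's real ones); the key point, as above, is that image:loc should be exactly the URL the page's HTML uses for the image:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The article page -->
    <loc>http://www.parismatch.com/example-article-12345</loc>
    <image:image>
      <!-- Ideally the same URL the HTML serves the image from, as recommended above -->
      <image:loc>http://cdn-parismatch.ladmedia.fr/example-image.jpg</image:loc>
    </image:image>
  </url>
</urlset>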
-
Not really, it seems to be OK.
-
Thanks! Hmmm, did it clear Search Console without any errors? I see an error in my browser --> http://screencast.com/t/VLWhg8EyR3Dd
-
The images are here:
http://www.parismatch.com/var/exports/sitemaps/sitemap_images_parismatch-10.xml
-
Is this your current sitemap?
http://www.parismatch.com/var/exports/sitemaps/sitemap_parismatch-index.xml
What is the direct address of the image sitemap(s)?
Thanks!
-
Thanks Dan. Unfortunately, we have changed the image host; the images now sit on a different CDN...
Before the redesign we used exactly this configuration, visible on this page (it's just an article; we don't have a slideshow example):
http://www.parismatch.com/Chroniques/Art-de-vivre/Lodge-Story-925785
We perhaps have a problem with the image sitemaps, because in the sitemaps submitted to Google we have:
<image:loc>http://cdn-parismatch.ladmedia.fr/var/news/storage/images/paris-match/culture/cinema/le-fils-de-saul-la-critique-763334/8067828-1-fre-FR/Le-Fils-de-Saul-la-critique.jpg</image:loc>
while the HTML source references the image at a different URL. Should we perhaps put the same image URLs in the sitemaps as are used in the HTML?
Many thanks for your help!

-
I see, thanks. Hmmm... did anything else change besides the redesign? Did the image URLs change, or did where they are hosted change?
The current implementation doesn't show any issues, but I wonder whether everything was handled properly in the move to the new design. Did you always have a slideshow format? Did the code change, or just the design?
-
Thanks Dan!
I agree with you. It's problematic because, since the website redesign, we have recorded a drop in image traffic from Google.

-
Hi There
There do not appear to be any accessibility issues. I can crawl and access the images just fine with my crawler.
My guess is that since the images are duplicates that also exist on other websites, Google may be avoiding indexing them: they are already indexed elsewhere, and they are technically not being linked to with a normal image tag.
Is this causing a particular issue for the site? Or is it just a pesky technical bug?
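To illustrate what I mean by a "normal" tag (hypothetical markup, not your actual code), compare:
<!-- A standard image reference in the rendered HTML: crawlers treat this as an image on the page -->
<img src="http://cdn.example.com/photo-full-size.jpg" alt="Example slide">
<!-- The same image only declared in a meta tag or loaded later by JS: it is not a normal image link, so it is much more likely to be skipped -->
<meta itemprop="image" content="http://cdn.example.com/photo-full-size.jpg">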
-
The display image is resized and indexed, but the full-size image is only referenced in a META tag and is not indexed.
-
How are your images being fed into the site? Are you using a CDN?
-Andy
-
The robots.txt file doesn't block the images; I checked it. The website runs on eZ Publish.
-
Hi Julien,
I always start with robots.txt in these cases, but that looks OK.
Is anything being blocked by JS? Something else to look at: if you are using something like WordPress, there are plugins that can block access to images without you realising it.
Looking at the URL of the image, this appears to be hosted on a 3rd party site?
-Andy