Google only indexed 19/94 images
-
I'm using Yoast SEO and have images (attachments) excluded from sitemaps, which is the recommended method (but could this be wrong?).
Most of my images are in my posts; here's the sitemap for posts: https://edwardsturm.com/post-sitemap.xml
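If you want the images indexed without re-enabling attachment pages, one option is listing them inside the post sitemap itself using Google's image sitemap extension. A minimal sketch of what one entry could look like, using the post and upload URLs from this thread:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://edwardsturm.com/best-games-youtube-2016/</loc>
    <image:image>
      <image:loc>https://edwardsturm.com/wp-content/uploads/2016/03/Reddit-Gaming-Front-Page.png</image:loc>
    </image:image>
  </url>
</urlset>
```

This way each image is attributed to the post that contains it, rather than to a thin attachment page.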
I also appear on p1 for some good keywords, and my site is getting organic traffic, so I'm not sure why the images aren't being indexed. Here's an example of a well performing article: https://edwardsturm.com/best-games-youtube-2016/
Thanks!
-
Thanks for following up!
-
For anybody wondering, the problem stems from my use of the Photon CDN in the Jetpack plugin. Photon is great, it really enhances UX, and their support is pretty top of the line, but since I put a lot of work into my post images, I'd like them to be indexed.
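(For context on why a CDN can matter here: Photon rewrites image `src` attributes to `i0.wp.com`/`i1.wp.com`/`i2.wp.com` hosts, so Google attributes the images to wp.com rather than to your own domain, and whether they get indexed depends on that host's crawl and robots settings rather than yours.)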
I'm going to try switching to AWS, because I see that that's what Moz is using.
-
Thanks Dan. I've figured out that I'm having problems because I'm using a CDN. I'm finding a workaround to this right now.
-
Edward
In the big picture, Martijn is right - unless you need traffic from the images, it's OK that they are not indexed.
But secondly, looks like they are hosted on wp.com like this one - https://i2.wp.com/edwardsturm.com/wp-content/uploads/2016/03/Reddit-Gaming-Front-Page.png?resize=860%2C883&ssl=1
They show up in my crawl OK, but have you tried a "fetch and render" in Google Search Console? I would do that on any of your blog posts and see if the render shows the images.
My guess is it's not an issue for you, but maybe something wp.com is blocking access to, or not allowing indexation on.
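If you want to rule out a robots.txt block on the CDN host, Python's stdlib robots.txt parser makes for a quick sanity check. The rules below are a hypothetical example, not wp.com's actual file; fetch the real one (e.g. https://i2.wp.com/robots.txt) and paste its contents in:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for a CDN host; replace with the
# real contents of the CDN's robots.txt file.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

image_url = "https://i2.wp.com/edwardsturm.com/wp-content/uploads/2016/03/Reddit-Gaming-Front-Page.png"
# True means the image crawler is allowed to fetch this URL
print(parser.can_fetch("Googlebot-Image", image_url))
```

If this prints False for your image URLs, the CDN's robots.txt is the culprit; if True, look instead at meta/header noindex directives or "fetch and render" output.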
-
Thanks. The strange thing is that it appears that only my post images aren't being indexed. My Facebook and Twitter images are, but the images that appear in the actual post are not, making it seem like this is purely a technical issue.
-
Hi Edward,
No, it is recommended not to list your attachment pages, as the quality of these pages is very low; that's why you don't want them included in the SERPs.
Usually image indexation can be a problem for bigger sites. Our indexation rate is a little higher than yours, but for now we also have fewer than 50% of our images indexed, and that's tens of thousands of images at the moment. Somehow Google doesn't find most of the images relevant to its users. What you could do to improve this is make sure there is always some relevant information around the image. Hope this helps!
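As a sketch of what "relevant information around the image" can look like in the markup (the filename, alt text, and caption below are illustrative, not a prescribed format):

```html
<figure>
  <img src="/wp-content/uploads/2016/03/reddit-gaming-front-page.png"
       alt="Reddit gaming front page showing the top posts of the week"
       width="860" height="883">
  <figcaption>The r/gaming front page, captured in March 2016.</figcaption>
</figure>
<p>Surrounding copy that discusses the same topic helps Google judge the image's relevance.</p>
```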
Related Questions
-
Google not Indexing images on CDN.
My URL is: https://bit.ly/2hWAApQ. We have set up a CDN on our own domain: https://bit.ly/2KspW3C. We have a main XML sitemap: https://bit.ly/2rd2jEb, and https://bit.ly/2JMu7GB is one of the sub-sitemaps with images listed within. The image entries use the CDN URLs. We verified the CDN subdomain in GWT. The robots.txt does not restrict any of the photos: https://bit.ly/2FAWJjk. Yet GWT still reports that none of our images on the CDN are indexed. I've followed all the steps and still none of the images are being indexed. My problem seems similar to this ticket (https://bit.ly/2FzUnBl), but is different because we don't have a separate image sitemap; instead we have listed the image URLs within the sitemaps themselves. Can anyone help, please? I will promptly respond to any queries. Thanks
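When image URLs live on a CDN subdomain, a quick sanity check is extracting every `image:loc` host from the sitemap and confirming each one is verified in Search Console. A minimal sketch with a hypothetical sub-sitemap snippet (in practice, fetch your real sitemap XML):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Hypothetical sub-sitemap snippet with an image listed under a page URL.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/some-page/</loc>
    <image:image>
      <image:loc>https://cdn.example.com/photos/product-1.jpg</image:loc>
    </image:image>
  </url>
</urlset>"""

ns = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "image": "http://www.google.com/schemas/sitemap-image/1.1",
}
root = ET.fromstring(sitemap_xml)
# Collect the distinct hosts that serve the images
image_hosts = {
    urlparse(loc.text).netloc
    for loc in root.findall(".//image:image/image:loc", ns)
}
print(image_hosts)  # every host here should be verified in Search Console
```

Any host that turns up here but is not verified (or is blocked by its own robots.txt) is a candidate explanation for the missing indexation.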
Technical SEO | TNZ Deepinder
-
My client is using a mobile template for their local pages and the Google search console is reporting thousands of duplicate titles/meta descriptions
So my client has 2,000+ different store locations. Each location has a standard desktop page, and my client opted for a corresponding mobile template for each location. Now Google Search Console is reporting thousands of duplicate titles/meta descriptions. However, this is only because the mobile template and desktop store pages use the exact same title/meta description tags. Is Google penalizing my client for this? Would it be worth it to update the mobile template's title/meta description tags?
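For separate mobile URLs, the pattern Google documents is a `rel="alternate"` annotation on the desktop page and a `rel="canonical"` on the mobile page, which tells Google the two are the same content rather than duplicates. A sketch with hypothetical URLs:

```html
<!-- On the desktop page, e.g. https://www.example.com/store/chicago/ -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/store/chicago/">

<!-- On the mobile page, e.g. https://m.example.com/store/chicago/ -->
<link rel="canonical" href="https://www.example.com/store/chicago/">
```

With these annotations in place, identical titles across the desktop/mobile pair stop being reported as duplicates.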
Technical SEO | RosemaryB
-
Local Google vs. default Google search
Hello Moz community, I have a question: what is the difference between a local version of Google vs. the default Google in regards to search results? I have a Mexican site that I'm trying to rank in www.google.com.mx, but my rankings are actually better if I check my keywords on www.google.com The domain is a .mx site, so wouldn't it make more sense that this page would rank higher on google.com.mx instead of the default Google site, which in theory would mean a "broader" scope? Also, what determines whether a user gets automatically directed to a local Google version vs. staying on the default one? Thanks for your valuable input!
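One lever worth checking for country targeting, alongside the ccTLD and Search Console's geographic target setting, is hreflang. A minimal sketch with a hypothetical domain (for a single-country Spanish-language site, a self-referencing annotation is enough):

```html
<link rel="alternate" hreflang="es-mx" href="https://www.example.com.mx/" />
```

This signals that the page targets Spanish speakers in Mexico, which can help google.com.mx surface it ahead of the default google.com results.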
Technical SEO | EduardoRuiz
-
Why did Google stop indexing my site?
Google used to crawl my site every few minutes. Suddenly it stopped and the last week it indexed 3 pages out of thousands. https://www.google.co.il/#q=site:www.yetzira.com&source=lnt&tbs=qdr:w&sa=X&ei=I9aTUfTTCaKN0wX5moCgAw&ved=0CBgQpwUoAw&bav=on.2,or.r_cp.r_qf.&fp=cfac44f10e55f418&biw=1829&bih=938 What could cause this to happen and how can I solve this problem? Thanks!
Technical SEO | JillB2013
-
WordPress - How to stop both http:// and https:// pages being indexed?
Just published a static page 2 days ago on a WordPress site, but noticed that Google has indexed both the http:// and https:// URLs. Usually only the http:// version gets indexed. Could anyone please explain why this may have happened and how I can fix it? Thanks!
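The usual fix is to pick one protocol and 301-redirect the other to it, so only one version can be indexed. A sketch of Apache `.htaccess` rules forcing HTTPS (flip the condition if HTTP is your preferred version; assumes `mod_rewrite` is enabled):

```apache
RewriteEngine On
# Redirect any plain-HTTP request to its HTTPS equivalent with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

Adding a self-referencing `rel="canonical"` on the preferred version reinforces the same signal.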
Technical SEO | Clicksjim
-
/index.php/ page
I was wondering: my system creates this page, www my domain com/index.php/ - is it better to block it with robots.txt or just canonicalize it?
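Canonicalization is generally the safer choice here (a robots.txt block prevents Google from ever seeing a canonical or consolidating signals). A third option is a 301 redirect; a hypothetical Apache `.htaccess` sketch, assuming `mod_rewrite`:

```apache
RewriteEngine On
# 301-redirect /index.php (with or without trailing slash) to the root URL
RewriteRule ^index\.php/?$ / [R=301,L]
```

If the page must keep resolving, a `<link rel="canonical">` pointing at the root URL achieves the consolidation without a redirect.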
Technical SEO | ciznerguy
-
Can JavaScript affect Google's index/ranking?
We changed our website template about a month ago, and since then we have experienced a huge drop in rankings, especially for our home page. We kept the same URL structure across the entire website, pretty much the same content, and the same on-page SEO. We expected some rank drop, but not one this big. We used to rank with the homepage at the top of the second page, and now we've lost about 20-25 positions.

What we changed: a new homepage structure, more user-friendly and with much better organized information; we also have a slider presenting our main services. 80% of our homepage content is included inside the slideshow and 3 tabs, but all these elements are JavaScript. The content is unique and SEO-optimized, but when I disable JavaScript, it becomes completely unavailable. Could this be the reason for the huge rank drop? I used Webmaster Tools' Fetch as Googlebot tool and it looks like Google reads perfectly what's inside the JavaScript slideshow, so I wasn't worried until I found this on SEOMoz: "Try to avoid ... using javascript ... since the search engines will ... not indexed them ..."

One more weird thing: although we have no duplicate content and the entire website has been cached, for a few pages (including the homepage) the picture snippet is from the old website. All main URLs are the same; we removed some old ones that we don't need anymore, so we kept all the inbound links. The 301 redirects are properly set. But still, we have a huge rank drop. Also (not sure if this is important or not), the robots.txt file is disallowing some folders like images, modules, templates... (Joomla components). We still have some HTML errors and warnings, but way fewer than with the old website. Any advice would be much appreciated, thank you!
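A robust pattern for slider/tab content is keeping it in the initial HTML and letting JavaScript only add the show/hide behavior (progressive enhancement), so the content survives with scripting disabled. A sketch with hypothetical IDs and file names:

```html
<!-- Content lives in the markup; the script only adds tab behavior. -->
<div class="tabs">
  <section id="tab-services">
    <h2>Our services</h2>
    <p>Full service description, present in the server-rendered HTML.</p>
  </section>
  <section id="tab-pricing">
    <h2>Pricing</h2>
    <p>Pricing details, also present without JavaScript.</p>
  </section>
</div>
<script src="/js/tabs.js"></script> <!-- enhancement only, not a content source -->
```

With this structure, disabling JavaScript leaves all the text visible, which is a quick manual test of what any non-rendering crawler can see.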
Technical SEO | echo1
-
Crawling image folders / crawl allowance
We recently removed /img and /imgp from our robots.txt file thus allowing googlebot to crawl our image folders. Not sure why we had these blocked in the first place, but we opened them up in response to an email from Google Product Search about not being able to crawl images - which can/has hurt our traffic from Google Shopping. My question is: will allowing Google to crawl our image files eat up our 'crawl allowance'? We wouldn't want Google to not crawl/index certain pages, and ding our organic traffic, because more of our allotted crawl bandwidth is getting chewed up crawling image files. Outside of the non-detailed crawl stat graphs from Webmaster Tools, what's the best way to check how frequently/ deeply our site is getting crawled? Thanks all!
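Beyond the Webmaster Tools graphs, your raw server access logs show exactly which URLs Googlebot requests and how often. A minimal sketch that counts Googlebot requests per path; the log lines below are hypothetical samples in combined log format (in practice, read them from your access log file):

```python
from collections import Counter

# Hypothetical Apache/Nginx combined-log lines for illustration.
log_lines = [
    '66.249.66.1 - - [10/May/2016:06:25:01 +0000] "GET /img/shoe.jpg HTTP/1.1" 200 5120 "-" "Googlebot-Image/1.0"',
    '66.249.66.1 - - [10/May/2016:06:25:09 +0000] "GET /product/shoe/ HTTP/1.1" 200 8192 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [10/May/2016:06:26:00 +0000] "GET /product/shoe/ HTTP/1.1" 200 8192 "-" "Mozilla/5.0"',
]

hits = Counter()
for line in log_lines:
    if "Googlebot" in line:
        # The request line is the first quoted field: "GET /path HTTP/1.1"
        path = line.split('"')[1].split()[1]
        hits[path] += 1

print(hits)  # paths requested by Googlebot (including Googlebot-Image)
```

Grouping by day instead of path shows crawl frequency over time. Note that user-agent strings can be spoofed, so for anything load-bearing, verify the hits with a reverse DNS lookup on the IP.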
Technical SEO | evoNick