Google only indexed 19/94 images
-
I'm using Yoast SEO and have images (attachments) excluded from sitemaps, which is the recommended method (but could this be wrong?).
Most of my images are in my posts; here's the sitemap for posts: https://edwardsturm.com/post-sitemap.xml
I also appear on page 1 for some good keywords, and my site is getting organic traffic, so I'm not sure why the images aren't being indexed. Here's an example of a well-performing article: https://edwardsturm.com/best-games-youtube-2016/
Thanks!
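For anyone who wants to run the same check on their own site, here's a rough Python sketch (assuming the requests library is installed; it uses the post sitemap URL above) that counts how many post URLs and image entries the sitemap actually exposes:

```python
# Rough sketch: count URL and image entries in the post sitemap,
# to see whether any image data is being exposed to Google via the sitemap.
import requests
import xml.etree.ElementTree as ET

SITEMAP = "https://edwardsturm.com/post-sitemap.xml"
NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "image": "http://www.google.com/schemas/sitemap-image/1.1",
}

root = ET.fromstring(requests.get(SITEMAP, timeout=30).content)
urls = root.findall("sm:url", NS)
images = root.findall(".//image:image", NS)
print(f"{len(urls)} post URLs, {len(images)} image entries in the sitemap")
```

If the sitemap exposes no image entries, Google has to discover the images by crawling the posts themselves.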
-
Thanks for following up!
-
For anybody wondering, the problem stems from my use of the Photon CDN from the Jetpack plugin. Photon is great: it really enhances UX, and their support is top of the line. But since I put a lot of work into my post images, I'd like them to be indexed.
I'm going to try switching to AWS, because I see that's what Moz uses.
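For anybody else hitting this, a quick way to confirm where the post images are actually served from is to list the image hostnames on a post. A rough sketch (the regex extraction is only illustrative, and the URL is the example post above):

```python
# Rough sketch: list the hostnames that a post's <img> tags point to,
# to confirm whether a CDN (e.g. Photon's i*.wp.com hosts) is serving them.
import re
from urllib.parse import urlparse

import requests

post_url = "https://edwardsturm.com/best-games-youtube-2016/"
html = requests.get(post_url, timeout=30).text

# Naive src extraction; a real check would use an HTML parser.
srcs = re.findall(r'<img[^>]+src=["\']([^"\']+)', html)
hosts = {urlparse(src).netloc for src in srcs if src.startswith("http")}

for host in sorted(hosts):
    print(host)  # hosts like i0.wp.com / i2.wp.com mean Photon is rewriting the images
```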
-
Thanks Dan. I've figured out that I'm having problems because I'm using a CDN. I'm finding a workaround to this right now.
-
Edward
In the big picture, Martijn is right: unless you need traffic from the images, it's OK that they aren't indexed.
But secondly, it looks like they are hosted on wp.com, like this one: https://i2.wp.com/edwardsturm.com/wp-content/uploads/2016/03/Reddit-Gaming-Front-Page.png?resize=860%2C883&ssl=1
They show up fine in my crawl, but have you tried a Fetch and Render in Google Search Console? I would do that on any of your blog posts and see whether the render shows the images.
My guess is that it's not an issue on your end, but that wp.com may be blocking access to the images or not allowing them to be indexed.
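As a complement to Fetch and Render, here's a minimal sketch (a rough manual check, not an official tool) that looks at whether the wp.com-hosted image above returns a normal response, carries an X-Robots-Tag header, or is disallowed for Googlebot-Image in the CDN's robots.txt:

```python
# Minimal sketch: check reachability, indexing headers, and robots.txt rules
# for the CDN-hosted image referenced in this thread.
import requests
from urllib.robotparser import RobotFileParser

image_url = ("https://i2.wp.com/edwardsturm.com/wp-content/uploads/2016/03/"
             "Reddit-Gaming-Front-Page.png?resize=860%2C883&ssl=1")

resp = requests.head(image_url, allow_redirects=True, timeout=30)
print("Status:", resp.status_code)                        # expect 200 if the image is reachable
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag"))  # a "noindex" here would block image indexing

robots = RobotFileParser("https://i2.wp.com/robots.txt")
robots.read()
print("Googlebot-Image allowed:", robots.can_fetch("Googlebot-Image", image_url))
```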
-
Thanks. The strange thing is that only my post images appear not to be indexed. My Facebook and Twitter images are indexed, but the images that appear in the actual posts are not, which makes this look like a purely technical issue.
-
Hi Edward,
No, that's not wrong: it is recommended not to list your attachment pages, because the quality of those pages is very low and you don't want them included in the SERPs.
Image indexation is usually more of a problem for bigger sites. Our indexation rate is a little higher than yours, but we also have less than 50% of our images indexed at the moment, and that's tens of thousands of images. Somehow Google doesn't find most of the images relevant to its users. What you can do to improve this is make sure there is always some relevant information available around the image. Hope this helps!
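One concrete thing to audit along those lines, sketched roughly below (the regex extraction is only illustrative, and the URL is the example post from this thread), is whether the post images even have descriptive alt text, since that's the most basic relevant information around an image:

```python
# Rough sketch: flag <img> tags on a post that have empty or missing alt text.
import re

import requests

post_url = "https://edwardsturm.com/best-games-youtube-2016/"
html = requests.get(post_url, timeout=30).text

for tag in re.findall(r"<img[^>]*>", html, flags=re.I):
    alt = re.search(r'alt=["\']([^"\']*)', tag)
    if alt is None or not alt.group(1).strip():
        src = re.search(r'src=["\']([^"\']+)', tag)
        print("No alt text:", src.group(1) if src else tag[:80])
```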
Related Questions
-
Google is indexing blocked content in robots.txt
Hi, Google is indexing some URLs that I don't want indexed, and it is also indexing the same URLs over https. These URLs are blocked in the robots.txt file. I've tried to block these URLs through Google Webmaster Tools, but Google doesn't let me because the URLs are https. The robots.txt file is correct, so what can I do to keep this content from being indexed?
Technical SEO | elisainteractive
-
Does hidden text, which appears for an onclick event, get indexed by Google and what SEO impact does this have?
I'm trying to simplify a conversion process by using an onclick event to show text rather than sending users to a completely separate page, but I'm wondering if this will negatively impact SEO, especially since it's hidden text. I've seen a couple of approaches where you position the text off-screen and the onclick brings it into view.
Technical SEO | JuiceBoxOM
-
Disallow: /search/ in robots but soft 404s are still showing in GWT and Google search?
Hi guys, I've already added the following directive to robots.txt to prevent search engines from crawling the dynamic pages produced by my website's search feature: Disallow: /search/. But soft 404s are still showing in Google Webmaster Tools. Do I need to wait (it's been almost a week since I added the directive to robots.txt)? Thanks, JC
Technical SEO | esiow2013
-
Best way to handle indexed pages you don't want indexed
We've had a lot of pages indexed by Google which we didn't want indexed. They relate to an AJAX category filter module that works fine for front-end customers, but under the bonnet Google has been following all of the links. I've put a rule in the robots.txt file to stop Google from crawling any dynamic pages (with a ?) and also any AJAX pages, but the pages are still indexed on Google. At the moment there are over 5,000 indexed pages which I don't want in there, and I'm worried this is causing issues with my rankings. Would a redirect rule work, or could someone offer any advice? https://www.google.co.uk/search?q=site:outdoormegastore.co.uk+inurl:default&num=100&hl=en&safe=off&prmd=imvnsl&filter=0&biw=1600&bih=809#hl=en&safe=off&sclient=psy-ab&q=site:outdoormegastore.co.uk+inurl%3Aajax&oq=site:outdoormegastore.co.uk+inurl%3Aajax&gs_l=serp.3...194108.194626.0.194891.4.4.0.0.0.0.100.305.3j1.4.0.les%3B..0.0...1c.1.SDhuslImrLY&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.&fp=ff301ef4d48490c5&biw=1920&bih=860
Technical SEO | gavinhoman
-
URL canonicalization: www to non-www
Hey there. Sorry for the simple question, but I recently redesigned a site and published it with WordPress, and in the process the URLs changed from the www version of the domain to the non-www version. My question is: does this change affect the value we get from links pointing to the old www URLs? The reason I ask is that the old site had a domain authority of 36 in OSE and a couple of hundred links, but the new site address shows zero domain authority and zero links. Is there some best practice I should be following to retain link value?
Technical SEO | Luia
-
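One sanity check for a move like the one described above, sketched here with a placeholder URL (example.com is not the real site), is to confirm the old www addresses return a permanent (301) redirect to the new ones so existing links keep resolving:

```python
# Sketch with a placeholder domain: confirm the old www host 301-redirects
# to the equivalent URL on the bare domain.
import requests

old_url = "http://www.example.com/some-post/"  # placeholder for a URL on the old www host

resp = requests.get(old_url, allow_redirects=False, timeout=30)
print("Status:", resp.status_code)                # 301 is what you'd want for a permanent move
print("Location:", resp.headers.get("Location"))  # should be the same path on the bare domain
```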
Mobile Google Not Indexing Mobile Website
Google currently does not index our mobile website; it has the WWW website in its index. When a user on a mobile phone clicks a mobile search result for the WWW site, we redirect them to our mobile website. This is posing problems for us, because our mobile website has only a fraction of the pages/sections of our WWW site. For example, mobile search results show that we have a "careers" section, but that section doesn't exist on the mobile website, so the user gets a 404. How do we get mobile Google to index our mobile website instead of our WWW site?
Technical SEO | RBA
-
Duplicate Homepage: www.mysite.com/ and www.mysite.com/default.aspx
Hi, I have a question regarding our client's site, http://www.outsolve-hr.com/, which runs on ASP.NET. Google has indexed both www.outsolve-hr.com/ and www.outsolve-hr.com/default.aspx, creating a duplicate content issue. We have added a rel="canonical" tag pointing to http://www.outsolve-hr.com/ to the default.aspx page. Now, because www.outsolve-hr.com/ and www.outsolve-hr.com/default.aspx are the same page on the backend, the canonical tag also appears on http://www.outsolve-hr.com/ when I view the source of the loaded page. Is this a problem? We cannot 301 redirect www.outsolve-hr.com/default.aspx to www.outsolve-hr.com/ because that creates an infinite loop, since on the backend they are the same page. So my question is two-fold: Will Google penalize the site for having the rel=canonical on the actual homepage, i.e. the canonical URL itself? And is rel="canonical" the best solution to fix the duplicate homepage issue on ASP.NET? Lastly, if Google has not indexed duplicate pages such as https://www.outsolve-hr.com/DEFAULT.aspx, is it a problem that they exist? Thanks in advance for your knowledge and assistance. Amy
Technical SEO | flarson
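For a setup like the one described above, here's a rough sketch (the regex extraction is only illustrative) to confirm both homepage variants declare the same canonical URL:

```python
# Sketch: pull the rel="canonical" value from both homepage variants to confirm
# they point to the same canonical URL.
import re

import requests

for url in ("http://www.outsolve-hr.com/", "http://www.outsolve-hr.com/default.aspx"):
    html = requests.get(url, timeout=30).text
    tags = re.findall(r'<link[^>]*rel=["\']canonical["\'][^>]*>', html, flags=re.I)
    if not tags:
        print(url, "-> no canonical tag found")
    for tag in tags:
        href = re.search(r'href=["\']([^"\']+)', tag, flags=re.I)
        print(url, "->", href.group(1) if href else "(canonical tag without href)")
```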