Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Images Not Indexing? (Nudity Warning!) - Before & After Photos
-
One of our clients is in the cosmetic surgery business (bodevolve.com), and the individuals most likely to purchase a cosmetic procedure search for only two things: **before & after photos** and **cost**.
That being said, we've worked extremely hard to optimize all 500+ before and after photos. To our great disappointment, they still aren't being indexed. We are testing a few things, but any feedback would be greatly appreciated!
All photos are in the 'attachment' sitemap: http://bodevolve.com/sitemap_index.xml
I'm also testing a few squeeze pages like this one: http://bodevolve.com/tummy-tuck-before-and-after-photos/
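As a rough illustration of what we're aiming for, an image sitemap for one of those squeeze pages could be generated along these lines (the image URLs and titles below are made-up placeholders, not the live ones):

```python
# A minimal sketch: build a Google image sitemap so each gallery page lists
# its before & after photos with a descriptive title.
# The image URLs and titles below are hypothetical placeholders.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"

PAGES = {
    "http://bodevolve.com/tummy-tuck-before-and-after-photos/": [
        ("http://bodevolve.com/images/tummy-tuck-before-after-01.jpg",
         "Tummy tuck before and after photo - patient 1"),
        ("http://bodevolve.com/images/tummy-tuck-before-after-02.jpg",
         "Tummy tuck before and after photo - patient 2"),
    ],
}

def build_image_sitemap(pages):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             f'<urlset xmlns="{SITEMAP_NS}" xmlns:image="{IMAGE_NS}">']
    for page_url, images in pages.items():
        lines.append("  <url>")
        lines.append(f"    <loc>{escape(page_url)}</loc>")
        for image_url, title in images:
            lines.append("    <image:image>")
            lines.append(f"      <image:loc>{escape(image_url)}</image:loc>")
            lines.append(f"      <image:title>{escape(title)}</image:title>")
            lines.append("    </image:image>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    with open("image-sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_image_sitemap(PAGES))
```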
Thanks so much,
Brit
-
Hi Britney Muller,
What are you doing to get your images indexed well? I have the same problem... thanks!
-
It's also worth mentioning that Google gets SO MANY of these before and after requests that they have created a segmented carousel (Dr. Pete would prob punch me for using that name haha!). Nudity is not an issue for these indexed/segmented photos, and the carousel greatly improves UX.
THANKS PEOPLE!!

Peace, Love & Grandma Hugs
-
Great example! We haven't been flagged for any nudity, but it's great to be aware of situations like that. Thanks so much!
-
Image names have been optimized. Are you referring to the "?attachment_id=117" URLs that WP auto-assigns to uploads?
Are you familiar with any custom code we could use to alter that auto-generated attachment URL?
Thanks so much
-
Hi MoosaHemani,
Thanks so much for your response! Only a small % of the images are indexed (the ones that have been embedded on pages), while we are trying to make sure the before and after photos are properly indexed. All of the photo names have been optimized, aside from WP's auto ?attachment_id= thing.
Also, how would we "use natural anchor text instead of targeting keywords all the time" for photos? The primary website is naturally linked to, but I'm not sure what you mean for the photos.
Thanks,
B
-
Also, my direct experience tells me that changing the image name greatly improves image rankings in the SERPs.
-
When there is a nudity warning, whether the images get indexed is not under our control. Here is an example for a similar query.
-
As far as I can see, your website's images are getting indexed (not all of them, but they are in the Google search index). Why they are not appearing in the search results when someone types in a related keyword is a different question. If that is what you are after, my advice would be to use proper, optimized, natural anchor text instead of targeting keywords all the time.
Also, try to change the names of the images to something more relevant: "liposuction.jpg" is much better than "big-image.jpg".
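As a rough illustration (the folder name and keyword prefix below are just placeholders), a batch rename before uploading could look something like this:

```python
# Minimal sketch: rename generically named images (IMG_1234.jpg, big-image.jpg, ...)
# to descriptive, keyword-based filenames before uploading them.
# The folder path and the "liposuction" prefix are placeholder assumptions.
from pathlib import Path

def rename_images(folder, keyword_prefix):
    folder = Path(folder)
    images = sorted(p for p in folder.iterdir()
                    if p.suffix.lower() in {".jpg", ".jpeg", ".png"})
    for index, path in enumerate(images, start=1):
        new_name = f"{keyword_prefix}-before-and-after-{index:03d}{path.suffix.lower()}"
        target = path.with_name(new_name)
        if not target.exists():  # never overwrite an existing file
            path.rename(target)
            print(f"{path.name} -> {new_name}")

if __name__ == "__main__":
    rename_images("liposuction-photos", "liposuction")
```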
Hope this helps!
Related Questions
-
Mass Removal Request from Google Index
Hi, I am trying to cleanse a news website. When this website was first made, the people that set it up copied in all kinds of articles they had as a newspaper, including tests, internal communication, and drafts. This site has lots of junk, but that junk was in the initial backup, i.e. before 1st June 2012. So, by removing all mixed content prior to that date, we can have pure articles starting 1st June 2012! Therefore:
1. My dynamic sitemap now contains only articles with a release date between 1st June 2012 and now.
2. Any article with a release date prior to 1st June 2012 returns a custom 404 page with a "noindex" meta tag, instead of the actual content of the article.
The question is how I can remove all this junk from the Google index as fast as possible, given that it is not on the site anymore but still appears in Google results. I know that for individual URLs I need to request removal via https://www.google.com/webmasters/tools/removals - the problem is doing this in bulk, as there are tens of thousands of URLs I want to remove.
Should I put the articles back in the sitemap so the search engines crawl the sitemap and see all the 404s? I believe this is very wrong. As far as I know it will cause problems, because search engines will try to access non-existent content that the sitemap declares as existent, and will return errors in Webmaster Tools.
Should I submit a deleted-items sitemap using the <expires> tag? I think that is for custom search engines only (https://developers.google.com/custom-search/docs/indexing#on-demand-indexing), not for the generic Google search engine.
The site unfortunately doesn't use any kind of "folder" hierarchy in its URLs, but instead the ugly GET params, and a folder-based removal pattern is impossible since all articles (removed junk and actual articles) are of the form http://www.example.com/docid=123456
So, how can I bulk remove all the junk from the Google index... relatively fast?
Intermediate & Advanced SEO | ioannisa
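One way to sanity-check a cleanup like this is to confirm that the removed URLs really do return a 404/410 and a noindex tag, so they drop out naturally on recrawl. A minimal sketch, assuming a plain-text list of the junk docids (the file name is a placeholder):

```python
# Minimal sketch: spot-check that removed article URLs now return 404/410
# and carry a noindex meta tag, so they can drop out of the index on recrawl.
# "removed_docids.txt" is an assumed plain-text file, one docid per line.
import requests

BASE = "http://www.example.com/docid="

def check_removed(docids):
    for docid in docids:
        url = f"{BASE}{docid}"
        resp = requests.get(url, timeout=10)
        gone = resp.status_code in (404, 410)
        noindexed = "noindex" in resp.text.lower()
        print(f"{url}: status={resp.status_code} gone={gone} noindex={noindexed}")

if __name__ == "__main__":
    with open("removed_docids.txt") as f:
        docids = [line.strip() for line in f if line.strip()]
    check_removed(docids[:100])  # sample the first 100
```
-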
Display:None CSS & SEO
Hi, a while back I was told that using display:none to hide content you want minimised is bad for on-page SEO - is this the case? It's not that we want to hide it from Google, we just don't want it taking up a huge amount of space on product pages. I have found some of these on our site and want to know how bad they are. Is the content under the tag going to be ignored? Thank you
Intermediate & Advanced SEO | BeckyKey
-
Is it worth creating an Image Sitemap?
We've just installed the server-side script 'XML Sitemaps' on our eCommerce site. The script gives us the option of (easily) creating an image sitemap, but I'm debating whether there is any reason for us to do so. We sell printer cartridges, so all the images will be pretty dry (a brand-name printer cartridge in front of a box being a favourite). I can't see potential customers searching for an image as a route into the site, and Google appears to be picking up our images of its own accord, so I wonder if we'd just be crawling the site and submitting this information for no real reason. From a quality perspective, would Google give us any kind of kudos for providing an image sitemap? Would it potentially increase their crawl frequency or, indeed, reduce the load on our servers as they wouldn't have to crawl for all the images themselves? I can't stress how little of a hardship it will be to create one of these automatically daily, but am wondering if, like meta keywords, there is any benefit to doing so.
Intermediate & Advanced SEO | ChrisHolgate
-
Google Indexing Feedburner Links???
I just noticed that for lots of the articles on my website, there are two results in Google's index. For instance: http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html and http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+thewebhostinghero+(TheWebHostingHero.com) Now, my Feedburner feed is set to "noindex" and it's always been that way. The canonical tag on the webpage is set to <link rel='canonical' href='http://www.thewebhostinghero.com/articles/tools-for-creating-wordpress-plugins.html' /> and the robots tag is set to <meta name="robots" content="index,follow,noodp" />. I found out that there are scraper sites linking to my content using the Feedburner link. So should the robots tag be set to "noindex" when the requested URL is different from the canonical URL? If so, is there an easy way to do this in WordPress?
Intermediate & Advanced SEO | sbrault74
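As a rough illustration of the comparison involved, the tracking-parameter variants can be normalised back to the canonical form by stripping the utm_* parameters; a minimal sketch (not WordPress-specific, and the helper name is made up):

```python
# Minimal sketch: strip utm_* tracking parameters so a requested URL can be
# compared against the page's canonical URL. The helper name is made up.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_tracking_params(url):
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith("utm_")]
    return urlunparse(parts._replace(query=urlencode(kept)))

feed_url = ("http://www.thewebhostinghero.com/articles/"
            "tools-for-creating-wordpress-plugins.html"
            "?utm_source=feedburner&utm_medium=feed")
canonical = ("http://www.thewebhostinghero.com/articles/"
             "tools-for-creating-wordpress-plugins.html")

# True when the only difference is tracking parameters
print(strip_tracking_params(feed_url) == canonical)
```
-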
Wordpress blog in a subdirectory not being indexed by Google
Hi Mozzers, in my website's sitemap.xml, pages are listed such as /blog/ and /blog/textile-fact-or-fiction-egyptian-cotton-explained/. These pages are visible when you visit them in a browser and when you use the Google Webmaster tool "Fetch as Google" to view them (see attachment), however they aren't being indexed in Google. Not even the root directory for the blog (/blog/) is being indexed, and when we query site: www.hilden.co.uk/blog/ it returns 0 results in Google. Also note that the WordPress installation is located at /blog/, which is a subdirectory of the main root directory managed by Magento. I'm wondering if this is causing the problem. Any help on this would be greatly appreciated! Anthony
Intermediate & Advanced SEO | Tone_Agency
-
Image Maps
Hey forum, I'm curious about image maps. A few things I'm not sure about:
1. Will the links be followed? If so, will Google respect rel="nofollow"?
2. Will the image be considered one image (indexed as an image, etc.), or will each map segment be treated as a separate image?
3. Any other SEO pros/cons to consider when adding an image map to an existing page?
Thanks, Corwin.
Intermediate & Advanced SEO | corwin
-
Hosting images on multiple domains
I'm taking the following from http://developer.yahoo.com/performance/rules.html: "Splitting components allows you to maximize parallel downloads. Make sure you're using not more than 2-4 domains because of the DNS lookup penalty. For example, you can host your HTML and dynamic content on www.example.org and split static components between static1.example.org and static2.example.org." What I want to do is load page images (it's an eCommerce site) from multiple subdomains to reduce load times. I'm assuming that this is perfectly OK to do - I cannot think of any reason why this wouldn't be a good tactic to go with. Does anyone know of (or can think of) a reason why taking this approach could be in any way detrimental? Cheers mozzers.
Intermediate & Advanced SEO | eventurerob
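As a rough sketch of that approach, image URLs can be sharded deterministically across the static subdomains so a given image always loads from the same host and browser caching stays effective (hostnames follow the Yahoo example above; the image path is a placeholder):

```python
# Minimal sketch: deterministically assign each image path to one of the
# static subdomains, so the same image always maps to the same host and
# browser/proxy caches stay effective. Hostnames follow the example above.
import zlib

STATIC_HOSTS = ["static1.example.org", "static2.example.org"]

def shard_image_url(path):
    host = STATIC_HOSTS[zlib.crc32(path.encode("utf-8")) % len(STATIC_HOSTS)]
    return f"http://{host}{path}"

print(shard_image_url("/images/product-photo-01.jpg"))
```
-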
How to resolve Duplicate Page Content issue for root domain & index.html?
SEOmoz returns a Duplicate Page Content error for a website's index page, with both domain.com and domain.com/index.html listed separately. We had a rewrite in the .htaccess file, but for some reason this has not had an impact and we have since removed it. What's the best way (in an HTML website) to ensure all index.html links are automatically redirected to the root domain and these aren't seen as two separate pages?
Intermediate & Advanced SEO | ContentWriterMicky