Best way to noindex an image?
-
Hi all,
A client wanted a few pages noindexed, which was no problem using the meta robots noindex tag.
However, they now want the associated images removed, some of which still appear on pages that they want to keep indexed. I added the images to their robots.txt file a few weeks ago (probably over a month ago, actually), but they're all still showing when you do an image search.
What's the best way to noindex them for good, and how do I go about implementing it?
Many thanks,
Steve
-
The noindex meta tag keeps the page out of the index, but because images have their own URLs, it doesn't keep them out of the image index. You should use the noimageindex directive on the pages where the images in question reside.
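Alternatively, because the images are fetched at their own URLs, you can send the directive on the image responses themselves with the X-Robots-Tag HTTP header, which covers the files no matter which pages embed them. A sketch for Apache, assuming mod_headers is enabled and these hypothetical extensions match your files:

<Files ~ "\.(png|jpe?g|gif)$">
  Header set X-Robots-Tag "noindex"
</Files>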
-
Thanks Chris! So, in my instance, should I just add that directive to the 2 pages that feature the images but are still indexed? Or should I also add it to the pages that are already noindexed, just to be on the safe side?
-
Steve, as long as no other resources are linking to your images, you could use the noimageindex directive as such:
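A minimal sketch, assuming the directive goes in the <head> of each page that displays the images:

<meta name="robots" content="noimageindex">

If a page should also stay out of the web index itself, the values can be combined, e.g. <meta name="robots" content="noindex, noimageindex">. Note that noimageindex only applies to images embedded on that page; if the same image files appear on other indexable pages, Google can still pick them up from there.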
-
Thanks Ash. I've just found this Google resource about it - https://support.google.com/webmasters/answer/59819?hl=en - that says "URL removal requests expire after 90 days, after which the content may appear in our search results again." Is that true? Is there a more permanent way or would I/the client have to manually re-remove it every 90 days?
-
You have to remove the images via Google Webmaster Tools. This support document has the instructions: https://support.google.com/webmasters/answer/181721?hl=en
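The removal tool takes the images out quickly, but per that document you also need to block them so they don't reappear after the request expires. A sketch of the robots.txt approach it describes, assuming a hypothetical /images/ path on the client's site:

User-agent: Googlebot-Image
Disallow: /images/

Or, to block a single file:

User-agent: Googlebot-Image
Disallow: /images/dogs.jpg

With a block like that (or the noimageindex tag above) in place, the removal should stick beyond the 90-day window.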
Related Questions
-
Should I use NoIndex on short-lived pages?
Hello, I have a large number of product pages on my site that are relatively short-lived: probably in the region of a million+ pages that are created and then removed within a 24-hour period. Previously these pages were being indexed by Google and did receive landings, but in recent times I've been applying a NoIndex tag to them. I've been doing that as a way of managing our crawl budget, but also because the 410 pages that we serve when one of these product pages is gone are quite weak and deliver a relatively poor user experience. We're working to address the quality of those 410 pages, but my question is: should I be no-indexing these product pages in the first place? Any thoughts or comments would be welcome. Thanks.
Intermediate & Advanced SEO | PhilipHGray
-
How important is the file extension in the URL for images?
I know that descriptive image file names are important for SEO. But how important is it to include .png, .jpg, .gif (or whatever file extension) in the URL path? i.e. https://example.com/images/golden-retriever vs. https://example.com/images/golden-retriever.jpg Furthermore, since you can set the filename in the Content-Disposition response header, is there any need to include the descriptive filename in the URL path? Since I'm pulling most of our images from a database, it'd be much simpler to not care about simulating a filename, and just reference an image id in my templates. Example:
1. Browser requests GET /images/123456
2. Server responds with the image, setting both Content-Disposition and Link (canonical) headers:
Content-Disposition: inline; filename="golden-retriever"
Link: <https://example.com/images/123456>; rel="canonical"
Intermediate & Advanced SEO | dsbud
-
2 eCommerce stores that are identical, 1 for US and 1 for CA - what's the best way to SEO?
Hello everyone! I have an SEO question that I cannot solve given the parameters of the project, and I was wondering if someone could provide me with the next best alternative to my situation. Thank you in advance.
The problem: Two eCommerce stores are completely identical (structure, products, descriptions, content) but they are on separate domains for currency and targeting purposes. www.website-can.com is for Canada and www.website-usa.com is for the US. Due to exchange rate issues, we are unable to combine the 2 domains into 1 store and optimize.
What's been done? I have optimized the Canadian store with unique meta titles and descriptions for every page and every product. However, I have left the US store untouched. I would like to gain more visibility for the US store, but it is very difficult to create unique content considering the products are identical. I have evaluated using canonicals, but that would ask Google to only look at either the Canadian or US store - correct me if I'm wrong. I am looking for the next best solution given the challenges, and I was wondering if someone could provide me with some ideas.
Intermediate & Advanced SEO | Snaptech_Marketing
-
Image URL Change Catastrophe
We have a site with over 3mm pages indexed, and an XML sitemap with over 12mm images (312k indexed at peak). Last week our traffic dropped off a cliff. The only major change we made to the site in that time period was adding a DNS record for all of our images that moved them from a SoftLayer Object Storage domain to a subdomain of our site. The old URLs still work, but we changed all the links across our site to the new subdomain. The big mistake we made was that we didn't update our XML sitemap to the new URLs until almost a week after the switch (totally forgot that they were served from a process with a different config file). We believe this was the cause of the issue because:
1. The pages that dropped in traffic were the ones where the images moved, while other pages stayed more or less the same.
2. We have some sections of our property where the images are, and have always been, hosted by Amazon, and their rankings didn't crater. Same with pages that do not have images in the XML sitemap (like list pages).
3. There wasn't a change in the geographic breakdown of our traffic, which we looked at because the timing was around the same time as Pigeon.
4. There were no warnings or messages in Webmaster Tools to indicate a manual action around something unrelated.
5. The number of images indexed in our sitemap according to Webmaster Tools dropped from 312k to 10k over the past week.
The gap between the change and the drop was 5 days. It takes Google >10 days to crawl our entire site, so the timing seems plausible. Of course, it could be something totally unrelated and just coincidence, but we can't come up with any other plausible theory that makes sense given the timing and pages affected. The XML sitemap was updated last Thursday, and we resubmitted it to Google, but still no real change. Anyone had a similar experience? Any way to expedite the climb back to normal traffic levels?
Intermediate & Advanced SEO | wantering
-
Best way to target multiple geographic locations
Hello Mozzers! If you are a service provider wanting to target geographic locations outside the region where you're physically located, what's the best approach? For example, I have a service provider whose main market is not where they're located - they're based in Devon, UK, yet their main markets are London, Birmingham, Newcastle, and Edinburgh. They have clients in all these cities, so I could definitely provide content relevant to each city - perhaps a page for each city detailing work and services (and possibly listing clients). However, does the lack of a physical presence (and local phone number) in these cities make such city pages virtually impossible to rank these days? Does Google require a physical presence/phone number? Thanks in advance, Luke
Intermediate & Advanced SEO | McTaggart
-
Best way to block a sub-domain from being indexed
Hello, The search engines have indexed a sub-domain I did not want indexed - it's on old.domain.com and dev.domain.com - I was going to password-protect them, but is there a best-practice way to block them? My main domain's default robots.txt says:
Sitemap: http://www.domain.com/sitemap.xml
# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?*
Intermediate & Advanced SEO | JohnW-UK
-
Keyword Targeting Best Practices??
What is the best way to target a specific keyword? I rank well for several of my keywords but want to do better on others. How do I go about doing this?
Intermediate & Advanced SEO | bronxpad
-
Best way to de-index content from Google and not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not from Bing. What is the best way to go about doing this?
Intermediate & Advanced SEO | nicole.healthline