Can you noindex a page, but still index an image on that page?
-
If a blog is centered around visual images, and we have specific pages with high-quality content that we plan to have indexed to drive our traffic, but we have many pages with just our images...what is the best way to go about getting these images indexed?
We want to noindex all the pages with just images because they are thin content...
Can you noindex,follow a page, but still index the images on that page?
Please explain how to go about this concept.....
-
In theory, this shouldn't be a problem.
When you use an image on a page, you are embedding that image from somewhere else - its own URL.
Therefore, if the page itself is "noindex,follow", that does not apply to the images (or anything else embedded in the page, like a video, PDF, etc.). As long as the URL serving the image is allowed to be indexed on your root domain - which it is by default - the image can be indexed.
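To illustrate the distinction, here is a sketch of a thin gallery page (the page URL and image path are hypothetical) - the robots meta tag applies to the page's HTML document, not to the image file it embeds:

```html
<!-- On the thin gallery page, e.g. /gallery/sunset-photos/ -->
<meta name="robots" content="noindex,follow">

<!-- The image lives at its own URL. The page-level noindex above
     does not apply to it, so it can still appear in image search. -->
<img src="/images/sunset-over-lake.jpg" alt="Sunset over the lake">
```

If you ever did want to keep the image itself out of the index, you would need something targeting the image URL directly (an X-Robots-Tag response header or a robots.txt disallow) - which is exactly what you should avoid doing here.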
You can test this out by uploading images to your root domain and then, in the next few days, performing this site search:
site:yoursite.com filetype:jpg (or png/gif/whatever you used)
That will show which of your site's images are in the Google index. If that does not work, try this search:
site:yoursite.com inurl:imagefilename
Replace those details with your real domain and the filename you used when uploading, and it should show you whether or not the image is indexed. If not, you could try sharing the image on Google+. G+ is ridiculously effective at getting web pages and content indexed.
But just because the page the image appears on is instructed to be noindexed does not mean that the image itself will inherit that directive. By default, it should be indexed properly. You can also create image sitemaps to help with indexing (I believe you're running WordPress; Yoast's SEO WordPress plugin can help with this).
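For reference, a minimal image sitemap entry looks like this (the URLs are placeholders for your own pages and images; a plugin like Yoast can generate the file automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <!-- The page the image is embedded on (even if that page is noindexed) -->
    <loc>https://yoursite.com/gallery/sunset-photos/</loc>
    <!-- The image's own URL, which is what you want indexed -->
    <image:image>
      <image:loc>https://yoursite.com/images/sunset-over-lake.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

Submitting a sitemap like this in Search Console gives Google an explicit list of image URLs to crawl, independent of the indexing status of the pages that embed them.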
Hope this helps.