All images are noindex: will removing it all at once be an issue?
-
Hi,
Not sure how, but a few months ago all my images were set to noindex, which I only realized last week. We have 20K images that were indexed fine, but now when I check site:sitename it shows only 10 or 12, and when I inspect the elements in Chrome I can see noindex is set for all the images. We have been renaming the images and adding alt tags to most of them. Would it be an issue if we remove the noindex in one shot, or should we do a few at a time?
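For context, a noindex on image files is usually delivered as an X-Robots-Tag HTTP response header (an image file can't carry a meta robots tag itself). If that's how it was added here, removing it for all 20K images is typically a single configuration change rather than 20K edits; a minimal sketch, assuming Apache with mod_headers and a hypothetical rule like the one shown:

```apache
# Hypothetical: if the noindex header was being set like this...
<FilesMatch "\.(jpe?g|png|gif|webp)$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

# ...deleting that block, or explicitly unsetting the header, re-allows indexing:
<FilesMatch "\.(jpe?g|png|gif|webp)$">
  Header unset X-Robots-Tag
</FilesMatch>
```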
Thanks
-
Hi,
I can share from my personal experience: we also had noindex on our images earlier, though at a much smaller scale than yours, just 1,500 images. After removing the noindex there was no issue at all.
The images started showing in Google Images after 2 weeks, and after almost 6 weeks we could see most of them. You are doing the right thing by using the alt attribute; the only other thing is naming the image file correctly (for example, if someone searches for "honda city", the image should be named honda-city.jpg or similar, along with the alt text).
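To make the naming advice concrete, a small hypothetical example of the file name and alt attribute working together (all values made up):

```html
<!-- Descriptive, hyphenated file name plus matching alt text (hypothetical) -->
<img src="/images/honda-city-silver-sedan-front.jpg"
     alt="Silver Honda City sedan, front view">
```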
-
Thanks
Image labeling is being done as image-name-that-it-represents.jpg.
Would 5+ words in the file name and 5+ words in the alt text be fine?
-
I don't think anything bad will happen. Your conversion rate may drop, though, because 20K images will now be available in image search, and image-search visitors tend to convert less than regular organic visitors. I remember when I started ranking for the keyword "car": I saw a big increase in organic traffic but a drop in conversions. Try to be careful how you label your images. Also, add an annotation in GA so you can compare before and after.
Related Questions
-
Facebook widget and blocked images
A WordPress site has a footer widget for Facebook with some images, all of which are served within an iframe. The Facebook CDN's robots.txt is blocking the images from being crawled, so the Webmaster Tools rendering tool is reporting these 8 or so images as blocked. Should I be concerned?
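One quick way to confirm that it's the third party's robots.txt (and not your own) doing the blocking is to test one of the reported URLs against that robots.txt directly; a minimal sketch using only the Python standard library (the CDN URLs below are placeholders, substitute one that Webmaster Tools actually reports as blocked):

```python
from urllib import robotparser

# Parse the CDN's robots.txt (placeholder URL; use the host GWT reports)
rp = robotparser.RobotFileParser()
rp.set_url("https://scontent.xx.fbcdn.net/robots.txt")
rp.read()

# Check whether Googlebot is allowed to fetch one of the blocked images
image_url = "https://scontent.xx.fbcdn.net/v/example-widget-image.jpg"
print(rp.can_fetch("Googlebot", image_url))  # False -> blocked by the CDN, not by you
```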
Technical SEO | | MickEdwards
-
Issue with duplicate content
Hello guys, I have a question about duplicate content. Recently I noticed that Moz's system reports a lot of duplicate content on one of my sites. I'm a little confused about what I should do with that, because this content is created automatically. All the duplicate content comes from a subdomain of my site where we share cool images with people. This subdomain actually points to our Tumblr blog, where people re-blog our posts and images a lot. I'm really confused about how all this duplicate content is created and what I should do to prevent it. Please tell me whether I need to "noindex" or "nofollow" that subdomain, or suggest something better to resolve the issue. Thank you!
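If noindexing the subdomain is the route taken, Tumblr themes let you add tags to the page head; a minimal sketch of the usual meta robots approach (noindex so the pages drop out of the index, follow so links are still crawled):

```html
<!-- Added inside the Tumblr theme's <head>; applies to every page it renders -->
<meta name="robots" content="noindex, follow">
```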
Technical SEO | | odmsoft
-
Redundant Hostnames Issue in GA
I noticed another post on this, but I have another question. I am getting this message from Analytics: "Property http://www.example.com is receiving data from redundant hostnames. Consider setting up a 301 redirect on your website, or make a search and replace filter that strips 'www.' from hostnames. Examples of redundant hostnames: example.com, www.example.com." We don't have a 301 in place that manages this, and I am quite concerned about handling it the right way. We do have a canonical on our homepage that says: <link rel="canonical" href="http://www.example.com/" /> I asked on another site how to safely set up our 301 and I got this response:

RewriteCond %{HTTP_HOST} !^www.example.com$ [NC]
RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L,NE]

Is this the best way of handling it? Are there situations where this would not be the best way? We do have a few subdomains like beta.example.com in use and have a rather large site, so I just want to make sure I get it right. Thanks for your help! Craig
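One thing worth checking in that snippet: the negated condition (!^www.example.com$) matches every hostname except www.example.com, so it would also 301 subdomains like beta.example.com to the www site. A hedged alternative that redirects only the bare domain and leaves other subdomains alone:

```apache
# Sketch (assumes mod_rewrite / RewriteEngine On): redirect only the bare
# apex to www, leaving subdomains such as beta.example.com untouched.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L,NE]
```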
Technical SEO | | TheCraig
-
GWT Images Indexing
Hi guys! How long does it normally take for Google to index the images within a sitemap? I recently submitted a new, up-to-date sitemap and most of the pages have been indexed already, but no images have. Any reason for that? Cheers
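One thing worth verifying: images are only picked up from a sitemap when they're declared with Google's image sitemap extension, not just linked from the indexed pages. A minimal sketch of one entry (URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://www.example.com/some-page/</loc>
    <!-- Each image on the page gets its own image:image block -->
    <image:image>
      <image:loc>http://www.example.com/images/product-photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```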
Technical SEO | | PremioOscar
-
Duplicate Content Issue
My issue with duplicate content is this: there are two versions of my website showing up, http://www.example.com/ and http://example.com/. What are the best practices for fixing this? Thanks!
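The usual fix is a site-wide 301 from one hostname to the other, plus a canonical tag pointing at the version you keep. A minimal .htaccess sketch, assuming Apache and that the www version is the one to keep:

```apache
# Sketch: 301 every non-www request to the www hostname
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^ http://www.example.com%{REQUEST_URI} [R=301,L]
```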
Technical SEO | | OOMDODigital
-
Development Website Duplicate Content Issue
Hi, we launched a client's website around 7th January 2013 (http://rollerbannerscheap.co.uk). We originally constructed the website on a development domain (http://dev.rollerbannerscheap.co.uk), which was active for around 6-8 months (the dev site was unblocked from search engines for the first 3-4 months, but then blocked again) before we migrated dev --> live. In late Jan 2013 we changed the robots.txt file to allow search engines to index the website. A week later I accidentally logged into the DEV website and also changed its robots.txt file to allow the search engines to index it. This obviously caused a duplicate content issue as both sites were identical. I realised what I had done a couple of days later and blocked the dev site from the search engines with the robots.txt file.

Most of the pages from the dev site had been de-indexed from Google apart from 3: the home page (dev.rollerbannerscheap.co.uk) and two blog pages. The live site has 184 pages indexed in Google, so I thought the last 3 dev pages would disappear after a few weeks. I checked back in late February and the 3 dev site pages were still indexed in Google. I decided to 301 redirect the dev site to the live site to tell Google to rank the live site and to ignore the dev site content. I also checked the robots.txt file on the dev site and this was blocking search engines too. But the dev site is still being found in Google wherever the live site should be found. When I do find the dev site in Google it displays this: Roller Banners Cheap » admin dev.rollerbannerscheap.co.uk/ "A description for this result is not available because of this site's robots.txt – learn more."

This is really affecting our client's SEO plan and we can't seem to remove the dev site or rank the live site in Google. In GWT I have tried to remove the subdomain. When I visit Remove URLs, I enter dev.rollerbannerscheap.co.uk, but then it displays the URL as http://www.rollerbannerscheap.co.uk/dev.rollerbannerscheap.co.uk. I want to remove a subdomain, not a page. Can anyone help please?
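One detail that may explain the stuck pages: while robots.txt blocks the dev site, Googlebot can't crawl it, so it never sees the 301s and simply keeps the stale listings (hence the "description not available" message). A common alternative is to allow crawling and send a noindex header instead; a sketch, assuming the dev host runs Apache with mod_headers:

```apache
# In the DEV site's .htaccess only: ask Google to drop every page.
# robots.txt must ALLOW crawling, or Googlebot never sees this header.
Header set X-Robots-Tag "noindex, nofollow"
```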
Technical SEO | | SO_UK
-
Severe Health issue on my site through Webmaster Tools
I use GoDaddy Website Tonight. I keep getting a severe health message in Google Webmaster Tools stating that my robots.txt file is blocking some important page. When I try to get more details, the blocked file will not open. When I asked the GoDaddy people, they told me that it was just image and backup files that do not need to be crawled. But if Google's spiders keep thinking an important page is blocked, will this hurt my SERPs?
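If the hosting control panel allows editing robots.txt, the blocking can be narrowed so only non-content files are excluded; a minimal sketch with hypothetical directory names:

```
# Keep real pages crawlable; block only backup and raw-asset directories
User-agent: *
Disallow: /backups/
Disallow: /assets/originals/
```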
Technical SEO | | VictorVC
-
How to noindex lots of content properly: bluntly or progressively?
Hello Mozers! I'm quite in doubt, so I thought, why not ask for help? Here's my problem: I need to improve the SEO of a website that consists of a lot (1 million+) of poor-content pages. Basically it's like a catalog, where you select a brand, then a product series, then the product, and then fill out a request form (sorry for the cryptic description, I can't be more precise). Besides the classic SEO work, part of what (I think) I need to do is noindex some useless pages and rewrite important ones with great content, but for the noindexing part I'm quite hesitant about the how. There are around 200,000 pages with no visits in the past year, so I guess they're pretty much useless junk that would be better off noindexed. But the webmaster is afraid that noindexing that many pages will hurt the site's long tail (in case of future visits), so he wants to check the SERP position of every one of them and only eliminate those that are in the top 3 (he thinks there's no hope of improvement for these). I think that would waste a lot of time and resources for nothing, and I'd advise noindexing them regardless of their position. The problem is I lack the experience to be sure of it, and of how to do it: is it wise to noindex 200,000 pages bluntly in one go (isn't that a bad signal for Google?), or should we do it progressively over a few months? Thanks a lot for your help! Johann.
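To make the "progressive" option concrete, a rough sketch of how a rollout could be batched (pure illustration: the file name, batch size, and weekly cadence are all assumptions):

```python
# Split ~200k candidate URLs into weekly noindex batches instead of
# flipping them all at once. Expects one URL per line in the input file.
BATCH_SIZE = 25_000  # roughly 8 weekly batches for 200k pages

def weekly_batches(path: str):
    with open(path) as f:
        urls = [line.strip() for line in f if line.strip()]
    for start in range(0, len(urls), BATCH_SIZE):
        yield urls[start:start + BATCH_SIZE]

for week, batch in enumerate(weekly_batches("noindex_candidates.txt"), start=1):
    # In practice each batch would be flagged noindex in the CMS/templates
    print(f"Week {week}: noindex {len(batch)} URLs")
```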
Technical SEO | | JohannCR