Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
-
Hi!
I have pages within my forum where visitors can upload photos. When they upload a photo they provide a simple statement about it, but no real information about the image, and certainly not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us.
The URL structure is as follows: domain.com/community/photos/~username~/picture111111.aspx
I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. It would look something like this:
User-agent: googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image file names to assist in visibility, but as for the actual act of blocking the pages while still getting the images picked up... is this possible?
Thanks!
Leona
-
Are you seeing the images actually getting indexed, though? Even if GWT recognizes the robots.txt directives, blocking the pages may essentially keep the images from having any ranking value. Like Matt, I'm not sure this will work in practice.
Another option would be to create an alternate path to just the images, like an HTML sitemap with just links to those images and decent anchor text. The ranking power still wouldn't be great (you'd have a lot of links on this page, most likely), but it would at least kick the crawlers a bit.
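In case it helps to picture it, here is a minimal sketch of what such an image-focused HTML sitemap page might look like. The file names and anchor text below are hypothetical placeholders, not actual paths from Leona's site; the idea is simply one crawlable page of plain links pointing straight at the image files, each with descriptive anchor text.

<!-- Hypothetical photo index page: plain links to the image files with descriptive anchor text -->
<html>
<head><title>Community Photo Index</title></head>
<body>
<h1>Community Photos</h1>
<ul>
<li><a href="/community/photos/example-user/red-widget-closeup.jpg">Red widget close-up</a></li>
<li><a href="/community/photos/example-user/blue-widget-in-use.jpg">Blue widget in use</a></li>
</ul>
</body>
</html>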
-
Thanks, Matt, for your time and assistance! Leona
-
Hi Leona - what you have done is along the lines of what I thought would work for you - sorry if I wasn't clear in my original response. I thought you meant that if you created a robots.txt and disallowed Googlebot only, then Googlebot-Image would still pick up the photos; as I said, that wouldn't be the case, because Googlebot-Image will follow what is set out for Googlebot unless you specify otherwise, for example with the Allow directive I mentioned. Glad it has worked for you - keep us posted on your results.
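To illustrate the fallback behaviour being described here, below is a rough Python sketch of how a Google crawler picks which robots.txt group to obey: it uses the group for the most specific user-agent token it answers to, and only falls back to * if nothing else matches. The function name, data structure, and token lists are assumptions made up for illustration, not Google's actual code.

def select_group(groups, crawler_tokens):
    """Illustrative sketch of robots.txt group selection (not Google's code).
    A crawler obeys the group for the most specific user-agent token it
    matches, falling back to '*' only if no other group applies.

    groups: dict of lowercase user-agent token -> list of rule strings
    crawler_tokens: tokens the crawler answers to, most specific first
    """
    for token in crawler_tokens:
        if token in groups:
            return groups[token]
    return []  # no matching group at all: everything is allowed


# Leona's file has separate groups for googlebot and googlebot-image,
# so the image crawler obeys its own group and ignores the broader block.
groups = {
    "googlebot": ["Disallow: /community/photos/"],
    "googlebot-image": ["Allow: /community/photos/"],
}
print(select_group(groups, ["googlebot-image", "googlebot", "*"]))
# -> ['Allow: /community/photos/']
print(select_group(groups, ["googlebot", "*"]))
# -> ['Disallow: /community/photos/']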
-
Hi Matt,
Thanks for your feedback!
It is not my belief that Googlebot overrides Googlebot-Image; otherwise, specifying something for one of Google's specific bots wouldn't work, correct?
I set up the following:
User-agent: googlebot
Disallow: /community/photos/
User-agent: googlebot-Image
Allow: /community/photos/
I tested the results in Google Webmaster Tools which indicated:
Googlebot: Blocked by line 26: Disallow: /community/photos/ (Detected as a directory; specific files may have different restrictions)
Googlebot-Image: Allowed by line 29: Allow: /community/photos/ (Detected as a directory; specific files may have different restrictions)
Thanks for your help!
Leona
-
Hi Leona
Googlebot-Image, and any of the other bots that Google uses, follow the rules set out for Googlebot, so blocking Googlebot would block your images as well, since the Googlebot rules carry over to Googlebot-Image. I don't think there is a way around this using the Disallow directive alone, as you are blocking the directory which contains your images, so the images themselves won't be indexed.
Something you may want to consider is the Allow directive -
Disallow: /community/photos/
Allow: /community/photos/~username~/
that is, if Google is already indexing images under the username path?
The Allow directive will only win out if its path contains at least as many characters as the Disallow path, so bear in mind that if you had the following:
Disallow: /community/photos/
Allow: /community/photos/
the Allow will win out and nothing will be blocked. Please note that I haven't used the Allow directive myself, but I looked into it in depth when studying robots.txt for my own sites; it would be good if someone else has had experience with this directive. Hope this helps.
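To make that tie-breaking rule concrete, here is a simplified Python sketch of the documented precedence behaviour: the rule with the longest matching path wins, and when an Allow and a Disallow match at equal length, the Allow wins. It deliberately ignores wildcards (*) and end-of-path anchors ($), and the example paths are illustrative, so treat it as a sketch rather than a faithful robots.txt parser.

def is_allowed(path, rules):
    """Simplified precedence check for one user-agent group (illustration only):
    the longest matching path prefix wins; on a length tie, Allow beats Disallow.
    Wildcards (*) and '$' anchors are not handled.

    rules: list of (is_allow, path_prefix) tuples
    """
    matches = [(len(prefix), is_allow)
               for is_allow, prefix in rules
               if path.startswith(prefix)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    longest = max(length for length, _ in matches)
    # Among the longest matches, any Allow outranks a Disallow of equal length
    return any(is_allow for length, is_allow in matches if length == longest)


# Equal-length Allow and Disallow: the Allow wins, so nothing is blocked
rules = [(False, "/community/photos/"), (True, "/community/photos/")]
print(is_allowed("/community/photos/some-user/picture1.aspx", rules))   # True

# Broader Disallow with a longer, more specific Allow underneath it
rules = [(False, "/community/photos/"), (True, "/community/photos/some-user/")]
print(is_allowed("/community/photos/some-user/picture1.aspx", rules))   # True
print(is_allowed("/community/photos/other-user/picture2.aspx", rules))  # False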