What is the best way to eliminate this specific thin, low-value image content?
-
The site in question is www.homeanddesign.com, where we are working on recovering from some big traffic loss.
I've finally gotten the site's articles proper meta titles and descriptions; now I'm working on removing low-value content.
The way their CMS is built, every clickable image gets its own page. This leads to a lot of thin content that I think needs to be removed from the index. Here is an example:
http://www.homeanddesign.com/photodisplay.asp?id=3633
I'm considering the best way to remove it from the index but not disturb how users enjoy the site.
What are my options? Here is what I'm thinking:
-
Add Disallow: /photodisplay to the robots.txt file
-
See if there is a way to make a lightbox instead of a whole new page for images. But this still leaves me with hundreds of pages containing just an image, some with backlinks, etc.
-
Add noindex tag to the photodisplay pages
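For that third option, the tag I have in mind would look like this in the `<head>` of each photodisplay page (a sketch; where exactly it goes depends on the CMS template):

```html
<!-- Tell crawlers not to index this page, but still follow its links -->
<meta name="robots" content="noindex, follow">
```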
-
-
Disallow: /photodisplay.asp?*
That should do it. But just to be safe you can add another one for:
Disallow: /photodisplay.asp
There is very, very little danger of blocking your entire site from being crawled by adding those disallow statements to your robots.txt file. If you're an SEO, your job is to "mess with" the robots.txt file. Furthermore, dynamically changing the robots meta tag to noindex based on page type is going to be much trickier, and potentially more dangerous, than adding a line to the robots.txt file.
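If you want extra peace of mind before deploying, you can sanity-check a disallow rule with Python's standard library (a minimal sketch; note that the stdlib parser doesn't understand Google's `*` wildcard, so the simpler prefix rule is used here):

```python
from urllib.robotparser import RobotFileParser

# Proposed rule: block every photodisplay page by URL prefix
rules = [
    "User-agent: *",
    "Disallow: /photodisplay",
]

rp = RobotFileParser()
rp.parse(rules)

# The thin image page should be blocked...
print(rp.can_fetch("*", "http://www.homeanddesign.com/photodisplay.asp?id=3633"))  # False
# ...while the rest of the site stays crawlable
print(rp.can_fetch("*", "http://www.homeanddesign.com/"))  # True
```

Because `Disallow` matches by prefix, `/photodisplay` covers `/photodisplay.asp` and every query-string variation of it.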
Don't forget to remove the pages from the index using the URL removal tool in GWT once the block has been added.
Also, I'd stop linking to those pages. It is best practice not to link to pages that you don't want indexed if you can help it. I'd go the lightbox route you mentioned above. This is something I do on my WordPress sites too.
Good luck!
-
Hi William,
I would personally go the route of adding the noindex tag to the photo pages. Messing with the robots.txt file would probably be quicker; however, I am a little hesitant about editing the robots.txt file if I don't have to. One slip and you could block your whole site or an entire directory from being crawled, versus specifically calling out each individual page with the noindex tag.
Lightboxes are fine, but like you say, you aren't really solving the problem of tons of other pages.
You could look into your CMS and see if there is a way to remove the automatically generated link to photodisplay.asp?id=XXXX, so that the images are still displayed with the `<img>` tag but it doesn't add the `<a href="">` wrapper... you know?
Hope this helps.
Mike
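In other words, something like this before/after of the CMS template output (a hypothetical sketch; the image path and alt text are made up, but the id matches the example URL above):

```html
<!-- Before: the CMS wraps each image in a link to its own thin page -->
<a href="/photodisplay.asp?id=3633"><img src="/images/3633.jpg" alt="Kitchen photo"></a>

<!-- After: the image alone, with no link to a photodisplay page -->
<img src="/images/3633.jpg" alt="Kitchen photo">
```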