What is the best way to eliminate this specific kind of thin image content?
-
The site in question is www.homeanddesign.com, where we are working to recover from a big traffic loss.
I have finally gotten the site's articles properly meta-titled and meta-described; now I'm working on removing thin content.
The way their CMS is built, every clickable image gets its own page. This leads to a lot of thin content that I think needs to be removed from the index. Here is an example:
http://www.homeanddesign.com/photodisplay.asp?id=3633
I'm considering the best way to remove these pages from the index without disturbing how users enjoy the site.
What are my options? Here is what I'm thinking:
- Add Disallow: /photodisplay to the robots.txt file.
- See if there is a way to use a lightbox instead of a whole new page for each image. But this still leaves me with hundreds of pages that contain just an image, with backlinks, etc.
- Add a noindex tag to the photodisplay pages.
-
Disallow: /photodisplay.asp?*
That should do it. But just to be safe you can add another one for:
Disallow: /photodisplay.asp
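Put together, the relevant section of the robots.txt file would look something like this (a minimal sketch assuming you want the rules to apply to all crawlers; keep whatever directives are already in the file):

User-agent: *
Disallow: /photodisplay.asp?*
Disallow: /photodisplay.asp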
There is very, very little danger of blocking your entire site from being crawled by adding those disallow statements to your robots.txt file. If you're an SEO, your job is to "mess with" the robots.txt file. Furthermore, trying to dynamically change the robots meta tag to noindex based on page type is going to be much trickier, and potentially more dangerous, than adding a line to the robots.txt file.
Don't forget to remove the pages from the index using the URL removal tool in GWT once the block has been added.
Also, I'd stop linking to those pages. It's best practice not to link to pages you don't want indexed if you can help it. I'd go the lightbox route you mentioned above; this is something I do on my WordPress sites too.
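If you do go the lightbox route, here is a very rough sketch of the idea (the class name and image paths below are just placeholders, not your actual CMS output): have the thumbnail link point at the image file itself instead of photodisplay.asp, then intercept the click and show the full-size image in an overlay.

<!-- Thumbnail link points at the image file rather than photodisplay.asp -->
<a href="/images/full-3633.jpg" class="lightbox-link">
  <img src="/images/thumb-3633.jpg" alt="Kitchen remodel">
</a>

<!-- Overlay, hidden until a thumbnail is clicked -->
<div id="lightbox" style="display:none; position:fixed; top:0; left:0; width:100%; height:100%; background:rgba(0,0,0,0.8); text-align:center;">
  <img id="lightbox-img" src="" alt="" style="max-width:90%; max-height:90%; margin-top:2%;">
</div>

<script>
  var overlay = document.getElementById('lightbox');
  var overlayImg = document.getElementById('lightbox-img');
  var links = document.querySelectorAll('a.lightbox-link');
  for (var i = 0; i < links.length; i++) {
    links[i].onclick = function (e) {
      e.preventDefault(); // stop the browser following the link
      overlayImg.src = this.href; // show the full-size image in the overlay
      overlay.style.display = 'block';
    };
  }
  overlay.onclick = function () { overlay.style.display = 'none'; };
</script>

Something along those lines keeps the browsing experience while removing the links to the photodisplay.asp pages entirely.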
Good luck!
-
Hi William,
I would personally go the route of adding the noindex tag to the photo pages. Messing with the robots.txt file would probably be quicker; however, I'm a little hesitant to touch the robots.txt file if I don't have to... one slip and you could block your whole site or an entire directory from being crawled, versus specifically calling out each individual page with the noindex tag.
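Since every photo page appears to come from the same photodisplay.asp template, that should mean one edit to that template rather than hundreds of individual pages. A minimal sketch of what would go inside the template's <head> (assuming it outputs a normal HTML head):

<!-- Ask search engines not to index the photo page, but still follow its links -->
<meta name="robots" content="noindex, follow">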
Lightboxes are fine, but like you say, that doesn't really solve the problem of the hundreds of pages that already exist.
You could also look into your CMS and see if there is a way to remove the automatically generated link to photodisplay.asp?id=XXXX, so that the images are still displayed but are no longer wrapped in the <a href> link... you know?
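Roughly, in the gallery or article template it would be the difference between something like this (the image path is just an illustration):

<a href="/photodisplay.asp?id=3633"><img src="/images/kitchen-thumb.jpg" alt="Kitchen"></a>

and simply:

<img src="/images/kitchen-thumb.jpg" alt="Kitchen">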
Hope this helps.
Mike