Easy question: noindex meta tag vs. robots.txt
-
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client whose site has a couple of subdirectories, "gallery" and "blog". Neither directory gets much traffic or drives many conversions, so I want to remove the pages so they don't drain PageRank from more important pages. Does this sound like a good idea?
I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting them, or deleting them. Can you help me determine which is best?
**DEINDEX:** As I understand it, the noindex meta tag will still allow robots to crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to pass through. That seems like a bad thing to me, because I don't want to waste link juice on these pages; the idea is to keep my PageRank from being diluted. A similar question: if PageRank is finite, does Google still treat these pages as part of the site even if it's not indexing them?
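For reference, this is the tag I mean; it goes in each page's `<head>`, and (as I understand it) the `follow` value makes the crawl-but-pass-links behavior explicit:

```html
<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```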
If I do deindex these pages, I think there are quite a few internal links pointing to them. Even though these pages are deindexed, they still exist, so it's not as if the site would return a 404, right?
**ROBOTS.TXT:** As I understand it, this will keep the robots from crawling the pages, so they won't be indexed and link juice won't pass. I don't want to waste the PageRank from links pointing to these pages, so is this a bad option?
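For what it's worth, this is what I had in mind for the robots.txt option (assuming the two directories sit at the site root):

```text
User-agent: *
Disallow: /gallery/
Disallow: /blog/
```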
**301 REDIRECT:** What if I just 301 redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure it's permanent, and, even more importantly, blog and gallery pages currently make up 80% of the site; I think it would be strange to have the vast majority of the site 301 redirecting to the homepage. What do you think?
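Assuming an Apache server, I'm guessing a blanket redirect of both directories could be done with something like this in .htaccess (mod_alias):

```apache
# Send every gallery and blog URL to the homepage with a permanent redirect
RedirectMatch 301 ^/(gallery|blog)/ /
```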
**DELETE PAGES:** Maybe I could just delete all the pages. This would keep them from absorbing link juice and would deindex them, but I think there are quite a few internal links to these pages. How would you find all the internal links that point to them? There are hundreds.
-
Hello Santaur,
I'm afraid this question isn't as easy as you may have thought at first. It really depends on what is on the pages in those two directories, what they're being used for, who visits them, etc. Certainly removing them altogether wouldn't be as terrible as some people might think IF those pages are of poor quality, have no external links, and very few, if any, visitors. It sounds to me like you might need a "Content Audit," wherein the entire site is crawled using a tool like Screaming Frog and relevant metrics are pulled for those pages (e.g. Google Analytics visits, Moz Page Authority, external links) so you can look at them and make informed decisions about which pages to improve, remove, or leave as-is.
Any page that gets "removed" will leave you with another choice: allow it to 404/410, or 301 redirect it. That decision should be easy to make on a page-by-page basis after the content audit, because you'll be able to see which pages have external links and/or visitors within the time period specified (e.g. 90 days). Pages you have decided to "remove" that have no external links and no visits in 90 days can probably just be deleted. The others can be 301 redirected to a more appropriate page, such as the blog home page, a top-level category page, a similar page, or, if all else fails, the site home page.
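For example, assuming an Apache server, the per-page outcomes could look like this in .htaccess (the paths here are hypothetical placeholders):

```apache
# Page with external links or recent visits: 301 to the closest relevant page
Redirect 301 /blog/useful-post.html /blog/
# Page with no external links and no visits in 90 days: tell crawlers it's gone
Redirect gone /gallery/unused-album.html
```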
Of course, any page that gets removed, whether it redirects or returns a 404/410, should have all internal links to it updated as soon as possible. The scan you did with Screaming Frog during the content audit will provide all internal links pointing to each URL, which should speed up that process considerably.
Good luck!
-
I would certainly think twice about removing those pages, as in most cases they're of value for both your SEO and your users. If you do decide to go this way and remove them, I would redirect all the pages in these subdirectories to another page (say, the homepage). Although you have a limited amount of traffic there, you still want to make sure that people who land on these pages are redirected to a page that does exist.
-
Are you sure you want to do this? You say 80% of the site consists of gallery and blog pages. You also say there are a lot of internal links to those pages. Are you perhaps underestimating the value of long-tail traffic?
To answer your specific question: yes, link juice will still pass through to pages that are noindexed; they just won't ever show up in search results. A robots noindex directive gets you the same result. 301 redirects will pass all your link juice back to the homepage, but make for a lousy user experience. The same goes for deleting pages.