Easy Question: noindex meta tag vs. robots.txt
-
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client who has a couple of subdirectories, "gallery" and "blog". Neither directory gets much traffic or produces many conversions, so I want to remove the pages so they don't drain PageRank from more important pages. Does this sound like a good idea?
I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting, or deleting them. Can you help me determine which is best?
**DEINDEX:** As I understand it, the noindex meta tag still allows robots to crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to pass through. This seems like a bad thing to me because I don't want to waste link juice by passing it to these pages. The idea is to keep my PageRank from being diluted on these pages. A related question: if PageRank is finite, does Google still treat these pages as part of the site even if it's not indexing them?
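For context, the noindex directive discussed here is a meta tag placed in each page's `<head>` (shown with "follow" so crawlers continue to pass through the page's links, which is the behavior described above):

```html
<meta name="robots" content="noindex, follow">
```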
If I do deindex these pages, I think there are quite a few internal links pointing to them. Even though these pages would be deindexed, they'd still exist, so it's not as if the site would return a 404, right?
**ROBOTS.TXT:** As I understand it, this will keep robots from crawling the pages, so they won't be indexed and link juice won't pass. I don't want to waste the PageRank of the links pointing to these pages, so is this a bad option?
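For reference, blocking the two directories via robots.txt would look like the following (assuming they live at /gallery/ and /blog/). Note that Disallow only prevents crawling; URLs that are already known to Google can remain in the index:

```
User-agent: *
Disallow: /gallery/
Disallow: /blog/
```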
**301 REDIRECT:** What if I just 301 redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure it's permanent, but even more importantly, 80% of the site is currently made up of blog and gallery pages, and I think it would be strange to have the vast majority of the site 301 redirecting to the home page. What do you think?
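If you did go the 301 route on an Apache server, a minimal sketch is below (the domain is a placeholder, and this assumes the pages live under /gallery/ and /blog/):

```apache
# Permanently redirect everything under /gallery/ and /blog/ to the homepage
RedirectMatch 301 ^/(gallery|blog)/ http://www.example.com/
```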
**DELETE PAGES:** Maybe I could just delete all the pages. This would keep the pages from taking link juice and would deindex them, but I think there are quite a few internal links to these pages. How would you find all the internal links that point to them? There are hundreds.
-
Hello Santaur,
I'm afraid this question isn't as easy as you may have thought at first. It really depends on what is on the pages in those two directories, what they're being used for, who visits them, etc. Certainly, removing them altogether wouldn't be as terrible as some people might think IF those pages are of poor quality, have no external links, and very few, if any, visitors. It sounds to me like you might need a "Content Audit," wherein the entire site is crawled using a tool like Screaming Frog, and relevant metrics are pulled for those pages (e.g. Google Analytics visits, Moz Page Authority, and external links) so you can look at them and make informed decisions about which pages to improve, remove, or leave as-is.
Any page that gets "removed" will leave you with another choice: Allow to 404/410 or 301 redirect. That decision should be easy to make on a page-by-page basis after the content audit because you will be able to see which ones have external links and/or visitors within the time period specified (e.g. 90 days). Pages that you have decided to "Remove" which have no external links and no visits in 90 days can probably just be deleted. The others can be 301 redirected to a more appropriate page, such as the blog home page, top level category page, similar page or - if all else fails - the site home page.
Of course any page that gets removed, whether it redirects or 404s/410s should have all internal links updated as soon as possible. The scan you did with Screaming Frog during the content audit will provide you with all internal links pointing to each URL, which should speed up that process for you considerably.
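If Screaming Frog isn't at hand, the same internal-link inventory can be scripted. Below is a minimal, hypothetical sketch using only Python's standard library: it collects `<a href>` values from a page's HTML and filters for the asker's /gallery/ and /blog/ directories (the sample HTML and all names here are illustrative, not from the original site):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links_to(html, prefixes):
    """Return links whose path falls under one of the given directory prefixes."""
    parser = LinkCollector()
    parser.feed(html)
    return [href for href in parser.links
            if urlparse(href).path.startswith(tuple(prefixes))]

page = '<a href="/blog/post-1">Post</a> <a href="/about">About</a> <a href="/gallery/img2">Pic</a>'
print(internal_links_to(page, ["/blog/", "/gallery/"]))  # ['/blog/post-1', '/gallery/img2']
```

Run against each crawled page's HTML, this yields the list of internal links that would need updating after the gallery and blog pages are removed.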
Good luck!
-
I would certainly think twice about removing those pages, as in most cases they're of value to both your SEO and your users. If you do decide to go this way and have them removed, I would redirect all the pages belonging to these subdirectories to another page (let's say the homepage). Although you have a limited amount of traffic there, you still want to make sure that the people who land on these pages get redirected to a page that does exist.
-
Are you sure you want to do this? You say 80% of the site consists of gallery and blog pages. You also say there are a lot of internal links to those pages. Are you perhaps underestimating the value of long-tail traffic?
To answer your specific question: yes, link juice will still pass through to pages that are noindexed; they just won't ever show up in search results. Using the robots noindex directive gets you the same result. 301 redirects will pass all your link juice back to the home page, but make for a lousy user experience. Same for deleting pages.
Related Questions
-
Is there any value in having a blank robots.txt file?
I've read an audit where the writer recommended creating and uploading a blank robots.txt file; there was no current file in place. Is there any merit in having a blank robots.txt file? What is the minimum you would include in a basic robots.txt file?
Technical SEO | | NicDale0 -
Do I need both canonical meta tags AND 301 redirects?
I implemented a 301 redirect set to the "www" version in the .htaccess (Apache server) file and my logs are DOWN 30-40%! I have to be doing something wrong!

```apache
AddType application/x-httpd-php .html .htm

RewriteCond %{HTTP_HOST} ^luckygemstones.com
RewriteRule (.*) http://www.luckygemstones.com/$1 [R=301,L]

RewriteCond %{THE_REQUEST} ^.*/index\.htm
RewriteRule ^(.*)index\.htm$ http://www.luckygemstones.com/$1 [R=301,L]

IndexIgnore *

ErrorDocument 404 http://www.luckygemstones.com/page-not-found.htm
ErrorDocument 500 http://www.luckygemstones.com/internal-serv-error.htm
ErrorDocument 403 http://www.luckygemstones.com/forbidden-request.htm
ErrorDocument 401 http://www.luckygemstones.com/not-authorized.htm
```

I've also started adding canonical METAs to EACH page. I'm still using HTML 4.0 loose (1000's of pages, painful to convert to HTML5), so I left the / off the tag so it would validate. Am I doing something wrong? Thanks, Kathleen
Technical SEO | | spkcp1110 -
Meta title Tag dilemma.... need help
Hey guys, I have a dilemma that I cannot figure out how to solve. One thing I have learned is that the meta title tag is probably one of the most important factors in SEO. I work in the real estate industry and we are located in a mid-sized market, Augusta, GA, which does not have a hugely competitive digital marketplace. I have told my web developer the changes I want her to make to the major sub-domain pages on our website, and I anticipate that once she makes those changes, allowing me to make the necessary SEO changes, we will see some good results. But I have one dilemma I can't figure out how to solve with the meta title tag. Check out our rental section: http://aubenrealty.com/rentals.cfm. Now, click on any rental property and it will take you to that rental's page. Notice the page title "Auben Realty- real estate....." This is identical for every active and non-active property on our website; every time we create a new property, this is what it spits out. Now take it a step further and click on "Contact me about this property," and you will see the same page title. My dilemma is: how do we fix this? My assumption is that the best page title would be the address of each property (e.g., 1322 Laurel Street, Augusta, GA 30904), right? Is this some kind of simple coding adjustment?
Technical SEO | | AubbiefromAubenRealty0 -
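Since the rental pages above are .cfm (ColdFusion), the fix is likely a small template change. A hedged sketch, assuming a hypothetical `property` struct populated from the listings database (the variable names are illustrative, not from the actual site):

```cfm
<cfoutput>
  <title>#property.address#, #property.city#, #property.state# #property.zip# | Auben Realty</title>
</cfoutput>
```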
Header Tag Question
While reviewing code on a site, I found the following:

```html
<h1 class="logo">
  <a id="logo" href="http://siteexampleh1.com"><span>Example of most important content on this page - Company</span></a>
</h1>
```

Is this the correct way to place code for an h1 tag? The content is cached within the page and is hidden from the viewer. The content assigned as the h1 is a logo. The majority of code I have been reviewing does not use this setup; the code would instead read as (This is heading 1). Can anyone provide insights on this? Thanks!
Technical SEO | | jfeitlinger0 -
Block or remove pages using a robots.txt
I want to use robots.txt to prevent Googlebot from accessing a specific folder on the server. Please tell me if the syntax below is correct:

```
User-agent: Googlebot
Disallow: /folder/
```

I also want to use robots.txt to prevent Google from indexing the images on my website. Please tell me if the syntax below is correct:

```
User-agent: Googlebot-Image
Disallow: /
```
Technical SEO | | semer0 -
Canonical Question
Our site has thousands of items; however, using the old "widgets" analogy, we are unsure how to implement the canonical tag, and whether we need to at all. At the moment our main product pages list all the different "widget" products on one page, but the user can visit sub pages that filter out the different versions of the product, i.e.:

glass widgets (20 products)
glass blue widgets (15 products)
glass red widgets (5 products)
etc.

plastic widgets (70 products)
plastic blue widgets (50 products)
plastic red widgets (20 products)
etc.

As the sub pages repeat products from the main widgets page, we added the canonical tag on the sub pages referring to the main widget page. The thinking is that Google won't hit us with a penalty for duplicate content. As such, the subpages shouldn't rank very well, but the main page should gather any link juice from them. Soon after we added the canonical tag, though, the Penguin update arrived; we lost 20%-30% of our traffic, and it's difficult not to think it was the canonical tag dropping our subpages from the SERPs. I'm tempted to remove the tag and return to how the site used to be, repeating products on subpages... not for SEO, but to help visitors drill down to what they want quickly. Any comments would be welcome.
Technical SEO | | Corpsemerch0 -
Pagination question
I have a website, http://www.example.com, with a pagination series starting at page1.html and running up to page10.html, with backlinks to some of the pages (page1.html, page2.html, ..., page7.html). If I include rel="next" and rel="prev" on page1.html through page10.html, will the value of those links be transferred to http://www.example.com? This is what I interpret from http://bit.ly/mUOrn2. Am I right?
Technical SEO | | seoug_20050 -
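For reference, the rel="next"/rel="prev" markup described in the question above goes in each page's `<head>`; e.g., on page2.html of the series (URLs taken from the question):

```html
<!-- on http://www.example.com/page2.html -->
<link rel="prev" href="http://www.example.com/page1.html">
<link rel="next" href="http://www.example.com/page3.html">
```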
Sitemap question
My sitemap includes www.example.com and www.example.com/index.html; they are both the same page. Will this have any negative effects, or can I remove www.example.com/index.html?
Technical SEO | | Aftermath_SEO0
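Beyond trimming the sitemap, one alternative technique (not mentioned in the question) for consolidating the two duplicate homepage URLs is a canonical tag on the index.html version pointing at the root URL:

```html
<!-- placed in the <head> of www.example.com/index.html -->
<link rel="canonical" href="http://www.example.com/">
```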