Easy question: noindex meta tag vs. robots.txt
-
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client whose site has a couple of subdirectories, "gallery" and "blog". Neither directory gets much traffic or drives many conversions, so I want to remove the pages so they don't drain page rank from more important pages. Does this sound like a good idea?
I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting them, or deleting them. Can you help me determine which is best?
**DEINDEX:** As I understand it, the noindex meta tag still allows robots to crawl the pages, but they won't be indexed. The supposed good news is that link juice still passes through them. That seems like a bad thing to me, because I don't want to waste link juice on these pages; the whole idea is to keep my page rank from being diluted. A related question: if page rank is finite, does Google still treat these pages as part of the site even though it isn't indexing them?
If I do deindex these pages, I think there are quite a few internal links pointing to them. Even though the pages are deindexed, they still exist, so the site wouldn't return a 404, right?
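For reference, the noindex directive being discussed is a one-line meta tag in each page's `<head>`:

```html
<!-- Page stays crawlable but is dropped from the index;
     link equity can still flow through its links -->
<meta name="robots" content="noindex">

<!-- Variant that also asks robots not to follow the page's links -->
<meta name="robots" content="noindex, nofollow">
```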
**ROBOTS.TXT:** As I understand it, this keeps robots from crawling the pages at all, so they won't be indexed and link juice won't pass. Since I don't want to waste the page rank flowing into these pages through internal links, is this a bad option?
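For reference, blocking both directories for all crawlers would look like this in robots.txt (the directory names are taken from the description above):

```
User-agent: *
Disallow: /gallery/
Disallow: /blog/
```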
**301 REDIRECT:** What if I just 301 redirect all these pages to the homepage? Is that an easy answer? One problem with this solution is that I'm not sure it's meant to be permanent, but even more importantly, roughly 80% of the site is made up of blog and gallery pages, and I think it would be strange to have the vast majority of the site 301 redirecting to the home page. What do you think?
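For illustration, a blanket redirect of both directories could be a couple of rules; this sketch assumes an Apache server with mod_alias (the actual server isn't stated):

```apache
# Permanently redirect everything under /gallery/ and /blog/ to the homepage
RedirectMatch 301 ^/gallery/ /
RedirectMatch 301 ^/blog/ /
```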
**DELETE PAGES:** Maybe I could just delete all the pages. That keeps them from taking link juice and deindexes them, but again, I think there are quite a few internal links to these pages. How would you find all the internal links pointing to them? There are hundreds of them.
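A crawler like Screaming Frog can export this list, but a rough Python sketch of the same idea is below; the domain is a placeholder and the requests/BeautifulSoup approach is an assumption, not anything specific to this site:

```python
# A rough sketch: crawl one site and print internal links that point into
# /gallery/ or /blog/. Domain, paths, and libraries are assumptions.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"   # placeholder domain
TARGETS = ("/gallery/", "/blog/")    # directories from the question

host = urlparse(START).netloc
seen, queue = set(), [START]

while queue:
    page = queue.pop()
    if page in seen:
        continue
    seen.add(page)
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(page, a["href"]).split("#")[0]  # drop fragments
        parsed = urlparse(link)
        if parsed.netloc != host:
            continue                     # external link, ignore
        if parsed.path.startswith(TARGETS):
            print(f"{page} -> {link}")   # internal link into gallery/blog
        if link not in seen:
            queue.append(link)
```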
-
Hello Santaur,
I'm afraid this question isn't as easy as you may have thought at first. It really depends on what is on the pages in those two directories, what they're being used for, who visits them, etc. Certainly removing them altogether wouldn't be as terrible as some people might think IF those pages are of poor quality, have no external links, and very few - if any - visitors. It sounds to me like you might need a "Content Audit," wherein the entire site is crawled using a tool like Screaming Frog and relevant metrics are pulled for those pages (e.g. Google Analytics visits, Moz Page Authority, external links) so you can look at them and make informed decisions about which pages to improve, remove, or leave as-is.
Any page that gets "removed" leaves you with another choice: allow it to 404/410, or 301 redirect it. That decision should be easy to make on a page-by-page basis after the content audit, because you will be able to see which pages have external links and/or visitors within the time period specified (e.g. 90 days). Pages you have decided to remove that have no external links and no visits in 90 days can probably just be deleted. The others can be 301 redirected to a more appropriate page, such as the blog home page, a top-level category page, a similar page, or - if all else fails - the site home page.
Of course, any page that gets removed, whether it redirects or 404s/410s, should have all internal links pointing to it updated as soon as possible. The Screaming Frog scan you ran during the content audit will give you all internal links pointing to each URL, which should speed up that process considerably.
Good luck!
-
I would certainly think twice about removing those pages, as in most cases they're valuable for both your SEO and your users. If you do decide to remove them, I would 301 redirect all the pages in those subdirectories to another page (say, the homepage). Even though those pages get limited traffic, you still want to make sure that people who land on them are redirected to a page that does exist.
-
Are you sure you want to do this? You say 80% of the site consists of gallery and blog pages. You also say there are a lot of internal links to those pages. Are you perhaps underestimating the value of long-tail traffic?
To answer your specific question: yes, link juice will still pass through to pages that are noindexed; they just won't ever show up in search results. The meta robots noindex tag gets you exactly that result. 301 redirects will pass all your link juice back to the home page, but they make for a lousy user experience. The same goes for deleting pages.