How to Disallow Tag Pages With Robots.txt
-
Hi, I have a site I'm dealing with that has tag pages, for instance:
http://www.domain.com/news/?tag=choice
How can I exclude these tag pages (about 20+) from being crawled and indexed by the search engines with robots.txt?
Also, they're sometimes created dynamically, so I want something that automatically excludes tag pages from being crawled and indexed.
Any suggestions?
Cheers,
Mark
-
Hi Nakul, it's Drupal.
Mark
-
What CMS is it, Mark?
-
Thanks, is there a way to test it out before actually implementing it on the site?
The site is non-WordPress as well.
Cheers,
Mark
-
I agree. I would suggest adding noindex to the pages and letting the bots crawl them. Blocking them would prevent future crawling of these pages, but I am guessing you would also want to remove the pages that are already indexed.
Therefore, add the noindex first, wait a few days, and then add the disallow (although technically, if they are noindexed, you don't really need the disallow).
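For example, noindex can be applied either as a meta tag in the <head> of each tag page or as an HTTP response header; this is a minimal, generic sketch rather than a Drupal-specific recipe:

<meta name="robots" content="noindex, follow">

or, sent as a header by the server instead:

X-Robots-Tag: noindex

Using "noindex, follow" drops the tag pages from the index while still letting crawlers follow the links on them.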
-
Hi Mark
If you're using WordPress, then I would recommend Yoast SEO to resolve the tag issue. If not, then I suggest you amend the robots.txt file to resolve it.
Here is an example:
Disallow: /?tag=
Disallow: /?subcats=
Disallow: /*?features_hash=
NOTE: Be very careful when blocking search engines. Test and test again!
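Since the tag URLs in this thread sit under /news/ (e.g. http://www.domain.com/news/?tag=choice), a Disallow rule anchored to the site root would not match them; a wildcard pattern is needed. A minimal sketch, assuming the goal is simply to block any URL containing a tag parameter:

User-agent: *
Disallow: /*?tag=
Disallow: /*&tag=

The second line catches URLs where tag is not the first parameter in the query string. Major search engines such as Google and Bing support the * wildcard in robots.txt, and a robots.txt testing tool (such as the one in Google Search Console) lets you check patterns against sample URLs before deploying them, which also addresses the earlier question about testing before going live.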