Ignore URLs with a pattern
-
I have 7,000 URL warnings because of a 302 redirect:
http://imageshack.us/photo/my-images/215/44060409.png/
I want to get rid of those. Is it possible to exclude these URLs with robots.txt? For example, so that nothing with /product_compare/ in its URL gets crawled?
Thank you
-
In case they do not all start with /catalog/, you can use a wildcard to match product_compare anywhere in the path:
Disallow: /*product_compare/
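A minimal robots.txt along these lines might look as follows (a sketch, assuming you want the rule to apply to every crawler; note that the * wildcard is understood by Google and Bing but is not part of the original robots.txt standard, which only matches path prefixes):

User-agent: *
Disallow: /*product_compare/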
-
Then you simply add this to your robots.txt:
Disallow: /catalog/product_compare/
That should keep crawlers out of all pages whose URLs start with:
https://www.theprinterdepo.com/catalog/product_compare/
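If you want a quick sanity check before deploying the rule, here is a rough sketch using Python's standard-library robots.txt parser (it only does prefix matching, which is all this rule needs; the second test URL is just a hypothetical page for illustration):

import urllib.robotparser

# The rule proposed above, as it would appear in robots.txt.
robots_txt = """\
User-agent: *
Disallow: /catalog/product_compare/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: the path starts with /catalog/product_compare/
print(rp.can_fetch("*", "https://www.theprinterdepo.com/catalog/product_compare/"))  # False

# Still crawlable: a hypothetical page outside the disallowed folder
print(rp.can_fetch("*", "https://www.theprinterdepo.com/catalog/"))  # True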
-
Could you perhaps post a URL which has product_compare in it?
You could alter your robots.txt file to stop robots from crawling pages under http://www.domain.com/product_compare/ by adding this line to your robots.txt file:
Disallow: /product_compare/
Related Questions
-
How Should I Structure URLs for a Portfolio?
Hi Moz Community, My web design agency has a lot of different projects we showcase in the portfolio section of our site, but I'm having trouble finding information on best practices for structuring the URLs for all of those portfolio pages. We have tons of projects in the same service category, and even multiple projects for the same company within that category. For example, right now things look like www.rootdomain.com/portfolio/web-design/clientname, which tends to get long, bulky and awkward, considering we do lots of projects in the web design category and might do a second project for the same company. How should we differentiate the projects from a URL standpoint to avoid having all of the pages compete for the same keyword? Does it even matter, given that these portfolio showcases are primarily image-based anyway?
Technical SEO | formandfunctionagency
-
Dynamic URL best approach
Hi, We are currently making changes to our travel site whereby, if someone does a search, that search can be stored, and the user can also paste the URL back into their browser to find the same search again. The URL will be dynamic for every search, so in order to avoid duplicate content I wanted to ask what the best approach to creating the URLs would be. An example of the URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if people have previous experience with something like this and what the best option for SEO would be. Will we need to create the URL with a # (as I read that Google stops crawling after the #)? Should we block the folder in robots.txt? Are there any other areas I should be aware of in order to stop duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
Technical SEO | Direct_Ram
-
URL path randomly changing
Hi everyone, got a quick question about URL structures. I'm currently working in ecommerce with a site that has hundreds of products that can be accessed through different URL paths:
1) www.domain.com/productx
2) www.domain.com/category/productx
3) www.domain.com/category/subcategory/productx
4) www.domain.com/bestsellers/productx
5) ...
In order to get rid of duplicate content issues, the canonical tag has been installed on all the pages that require it. The problem I'm witnessing now is the following: if a visitor comes to the site and navigates to the product through example 2), at times the URL shown in the browser address bar is example 4), sometimes example 1), or whatever. So it is constantly changing. Does anyone know why this happens and whether it has any impact on GA tracking or even on SEO performance? Any reply is much appreciated. Thank you
Technical SEO | ennovators
-
Canonical URLs on location-based offers
Hello world. I offer first aid courses in different locations in Switzerland. Now I'm not sure whether I have to mark the single registration pages with rel="canonical" or not. Example:
Location 1 -> course list @ Nothelferkurse Thun
Location 2 -> course list @ Nothelferkurse Bern
The content is almost the same. How do I have to handle this? Thanks for your help!
Technical SEO | alekaj
-
How important is keyword usage in the URL?
Hi, We have a client who has engaged us recently for some SEO work, and most of their website looks pretty good SEO-wise already. Many of their site pages rank at the top or middle of page two for their targeted keywords. In many cases they are not using the targeted keyword in the URL, and most pages could use some additional on-page clean-up. My question is: is it worth it to rewrite the URLs to include the targeted keyword and then do 301 redirects to send the old pages to the new ones in order to improve the ranking? Or should we just do the minor on-page work in the hope that this will be enough to improve the rankings and push them onto the first page? Thanks.
Technical SEO | Whebb
-
Rogue URL found in Webmaster Tools
Buon giorno from 2 degrees C thick fog in Wetherby, UK 🙂 On this site, www.davidclick.com, I ran a crawl test and came across a URL that doesn't exist on my site; the findings are illustrated here: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/rogue-urlcopy_zps6c58ee46.jpg The plot thickens... the source of the referring traffic to a page that doesn't exist can be seen here: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/rogue-link-source_zpsc70a34fc.jpg My initial thought is to disavow via this tool: https://www.google.com/webmasters/tools/disavow-links-main So my question is, please: is this sinister, or should I just sit back, drink a cup of Horlicks, and return to a Zen-like state of inner peace? Any insights welcome 😉 Grazie tanto, David
Technical SEO | Nightwing
-
How to handle temporary campaign URLs
Hi, We have just run a yearly returning commercial campaign for which we created optimized URLs (e.g. www.domain.tld/campaign, with the category and brand names appended after the campaign, e.g. www.domain.tld/campaign/womens). This has resulted in 4,500+ URLs being indexed in Google that include the campaign name. Now the campaign is over and these URLs no longer exist. How should we handle those URLs?
1.) 301 them to the correct category without the campaign name
2.) Create a static page www.domain.tld/campaign to which we 301 all URLs that have the campaign name in them
Do you have any other suggestions on what the best approach would be? This is a yearly commercial campaign, so in a year's time we will have the same URLs again. Thanks, Chris
Technical SEO | eCommerceSEO
-
Basic URL Structure Question
Hi, I'm putting together a URL for a product we are selling. We sell IT training courses, and the structure is normally:
Top folder = main courses section
Sub folder = vendor
Page = course name + term
An example is courses/microsoft/mcse-training. However, I have a product where the vendor and course name are the same. How should I best organise the URL - double mention or single mention? So:
a) courses/togaf/togaf-foundation-training, or
b) courses/togaf/foundation-training
Technical SEO | RobertChapman