Ignore URLs matching a pattern
-
I have 7,000 URL warnings caused by a 302 redirect.
http://imageshack.us/photo/my-images/215/44060409.png/
I want to get rid of those. Is it possible to exclude these URLs with robots.txt?
For example, so that nothing with /product_compare/ in its URL gets crawled?
Thank you
-
In case they do not all start with /catalog, you can use a wildcard pattern (supported by Google and Bing):
Disallow: /*product_compare/
-
Then you simply add this to your robots.txt:
Disallow: /catalog/product_compare/
That should exclude all pages whose URLs start with:
https://www.theprinterdepo.com/catalog/product_compare/
-
Could you perhaps post a URL which has product_compare in it?
You could alter your robots.txt file to stop robots from crawling pages under
http://www.domain.com/product_compare/ by adding this line to your robots.txt file: Disallow: /product_compare/
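Before deploying a rule like this, it can be worth sanity-checking it against a few real URLs. One quick way is Python's built-in robots.txt parser; note that urllib.robotparser does simple prefix matching and does not understand * wildcards, so test the literal-prefix form of the rule:

```python
# Sanity-check a robots.txt Disallow rule with Python's standard library.
# urllib.robotparser matches by path prefix only (no * wildcard support),
# so this tests the literal /catalog/product_compare/ form.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /catalog/product_compare/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Comparison pages are blocked...
print(rp.can_fetch("*", "https://www.theprinterdepo.com/catalog/product_compare/index/"))  # False
# ...while the rest of the catalog stays crawlable.
print(rp.can_fetch("*", "https://www.theprinterdepo.com/catalog/some-product.html"))  # True
```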
Related Questions
-
How to fix Submitted URL marked ‘noindex’
Hi, I recently discovered Google has stopped crawling/indexing my posts. So I checked my Search Console, and I saw this Coverage issue saying “Submitted URL marked ‘noindex’”. And any time I try requesting indexing for the affected pages, it tells me “Indexing request rejected”. Here is my site URL: http://bit.ly/2kfqTEv Here is one of the affected pages: http://bit.ly/39aMenJ0
Technical SEO | | Favplug -
Any idea why ?ref=wookmark is being appended to URL?
We have a https site and have been checking our 301 re-directs from the old http pages. All seem fine except one...and it is ONLY weird in Firefox (it works OK on Chrome and IE). The http version of that one URL is redirecting to the correct https URL, but with ?ref=wookmark being appended to the end. Why? On the Firefox browser only... http://www.easydigging.com/broadfork(dot)html 301 redirects to https://www.easydigging.com/broadfork(dot)html?ref=wookmark From the research I did Wookmark seems to be a JQuery feature, but we do not use it (as far as I know). And even if we do, it probably should not pop up when doing a 301 redirect. I did try clearing my cache a few times, with no change in the problem. Any help is appreciated 🙂
Technical SEO | | GregB1230 -
Google is indexing bad URLs
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was installed. While no longer utilized, it remained on the site for some time. This plugin began creating hundreds of URLs containing nothing but code on the page. I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created & indexed: 1. Added a directive in my .htaccess to 404 all of these URLs 2. Blocked /wp-content/uploads/revslider/ in my robots.txt 3. Manually de-indexed each URL using the GSC tool 4. Deleted the plugin However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest any next steps? Thanks!
Technical SEO | | Tom3_150 -
International URL Structures
Hi everyone! I've read a bunch of articles on the topic, but I can't seem to figure out a solution that works for this specific case. We are creating a site for a service agency; this agency has offices around the world. The site has a global version (in English, French & Spanish) and some country-specific versions. Here is where it gets tricky: in some countries, each office has a different version of the site, and in Canada, for example, we have a French and an English version. For cost and maintenance reasons, we want a single domain: www.example.com We want to be able to indicate via Search Console that each section is attached to a different country, but how should we go about it? I've seen some examples with subfolders like this: Global FR: www.example.com/fr-GL Canada FR: www.example.com/fr-ca France: www.example.com/fr-fr Does this work? It seems to make more sense to use subdirectories with a gTLD, but I'm not sure how that would work to indicate the difference between my French global version vs. the France site: Global FR: www.example.com/fr France: www.example.com/fr/fr Am I going about this the right way? I feel the more I dig into the issue, the less it seems there is a good solution available to indicate to Google which version of my site is geo-targeted to each country. Thanks in advance!
Technical SEO | | sarahcoutu150 -
Language-Specific Characters in URLs
Hi People, would really appreciate your advice as we are debating best practice and advice seems very subjective depending if we are talking to our dev or SEO team. We are developing a website aimed at the South American market with content entirely in Spanish. This is our first international site so our experience is limited. Should we be using Spanish characters (such as www.xyz.com/contáctanos) in URLs or should we use ASCII character replacements? What are the pros and cons for SEO and usability? Would really be great to get advice from the Moz community and make me look good at the same time as it was my suggestion 🙂 Nick
Technical SEO | | nickspiteri0 -
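For context on what "ASCII character replacements" means in practice: user agents transmit non-ASCII URL characters percent-encoded as UTF-8 bytes, so both forms below identify the same path. A minimal sketch using the question's example path (the domain is hypothetical):

```python
# How a non-ASCII URL path travels over the wire: it is
# percent-encoded as UTF-8 bytes. "á" (U+00E1) becomes the
# two bytes 0xC3 0xA1, i.e. %C3%A1.
from urllib.parse import quote, unquote

path = "/contáctanos"
encoded = quote(path)
print(encoded)           # /cont%C3%A1ctanos
print(unquote(encoded))  # /contáctanos
```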
Tool to Generate All the URLs on a Domain
Hi all, I've been using xml-sitemaps.com for a while to generate a list of all the URLs that exist on a domain. However, this tool only works for websites with under 500 URLs on a domain. The paid tool doesn't offer what we are looking for either. I'm hoping someone can help with a recommendation. We're looking for a tool that can: Crawl, and list, all the indexed URLs on a domain, including .pdf and .doc files (ideally in a .xls or .txt file) Crawl multiple domains with unlimited URLs (we have 5 websites with 500+ URLs on them) Seems pretty simple, but we haven't been able to find something that isn't tailored toward management of a single domain or that can crawl a huge volume of content.
Technical SEO | | timfrick0 -
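For anyone tempted to script this themselves: the core of such a tool is just fetching pages and extracting their links. A minimal, hypothetical sketch of the link-extraction step using only Python's standard library (a real crawler would add page fetching, robots.txt checks, rate limiting, and a visited-URL queue):

```python
# Minimal link extractor, the core building block of a DIY site crawler.
# Hypothetical sketch: resolves relative hrefs against the page URL
# and drops #fragments so the same page isn't counted twice.
from html.parser import HTMLParser
from urllib.parse import urljoin, urldefrag

class LinkExtractor(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links and strip #fragments
                    absolute, _ = urldefrag(urljoin(self.base_url, value))
                    self.links.add(absolute)

def extract_links(html, base_url):
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

For example, `extract_links('<a href="/files/guide.pdf">Guide</a>', "https://example.com/")` returns `{"https://example.com/files/guide.pdf"}` — note that .pdf and .doc links are picked up like any other href, which was one of the requirements above.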
Keyword and URL
I have a client who has a popular name (like 'Joe Smith'). His blog URL has only his first name and the name of his company in it, like joe.company.com. His blog doesn't rank well at all in the first 3-4 Google SERPs. I was thinking of advising him to change the URL of his blog to joesmith.company.com, and having his webmaster do 301 redirects from the old URL to the new one. Do you think this is a good strategy, or would you recommend something else? I realize ranking isn't just about the URL, it's about links, etc. But I think making his URL more specific to his name could help. Any advice greatly appreciated! Jim
Technical SEO | | JamesAMartin0 -
WordPress URL weirdness - why is Google registering non-pretty URLs?
I've noticed in my stats that Google is indexing some non-pretty URLs from my WordPress-based blog. For instance, this URL is appearing in Google search: http://www.admissionsquest.com/onboardingschools/index.php?p=439 It should be: http://www.admissionsquest.com/onboardingschools/2009/01/do-american-boarding-schools-face-growing-international-competition.html Last week I added the plugin Redirection in order to consolidate categories & tags. Any chance that this has something to do with it? Recommendations on how to solve this? FYI - I've been using pretty URLs with WordPress from the very beginning, and this is the first time I've seen this issue. Thanks in advance for your help!
Technical SEO | | peterdbaron0