Robots.txt & Disallow: /*? Question!
-
Hi,
I have a site where they have:
Disallow: /*?
Problem is we need the following indexed:
?utm_source=google_shopping
What would the best solution be? I have read:
User-agent: *
Allow: ?utm_source=google_shopping
Disallow: /*?
Any ideas?
-
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /?
Allow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
Allow: /wp-admin/admin-ajax.php

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://site.com/sitemap_index.xml
Use this; it should solve your problem.
Regards
-
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /?
Allow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
Allow: /wp-admin/admin-ajax.php

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://site.com/sitemap_index.xml
Will this work?
Regards,
Sajad
-
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /archives/
Disallow: /*?*
Allow: /comments/feed/
Disallow: /refer/
Disallow: /index.php
Disallow: /wp-content/plugins/
Allow: /wp-admin/admin-ajax.php

User-agent: Mediapartners-Google*
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: Adsbot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://site.com/sitemap_index.xml

Use this; it should help you.
Regards,
[Saad](https://clicktestworld.com/)
-
Hi Jeff,
The robots.txt tester (as per the above link) is definitely worth playing with and is the easiest route to achieving what you want.
Another, more reactive way of managing this in some cases is simply to review the range of parameters Google has naturally crawled, within Search Console.
You can see this in the old Search Console for now: log in and go to Crawl --> URL Parameters.
If Googlebot has encountered any query parameters, it will list them, and you'll then have options for how to manage them or exclude them from the index.
It can be a decent way of cleaning up a site with lots of indexed pages (1,000+), although please be sure to read this documentation before using it: https://support.google.com/webmasters/answer/6080548?hl=en
-
With this kind of thing, it's really better to pick the specific parameters (or parameter combinations) which you'd like to exclude, e.g.:
User-agent: *
Disallow: /shop/product/&size=*
Disallow: */shop/product/*?size=*
Disallow: /stockists?product=*
^ I just took the above from a robots.txt file which I have been working on, as these particular pages don't have 'pretty' URLs with unique content on them. Very soon that will change and the blocks will be lifted.
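If you're not sure which parameters are even in play, one quick way to build that list is to tally the query parameter names across a crawl or log export. Here's a minimal Python sketch, assuming you have a plain text file of crawled URLs, one per line (the filename is just a placeholder):

```python
from collections import Counter
from urllib.parse import urlsplit, parse_qsl

# Tally which query parameter names appear across a list of URLs,
# e.g. an export from your log files or a site crawler.
param_counts = Counter()
with open('crawled_urls.txt') as f:  # hypothetical input file
    for line in f:
        query = urlsplit(line.strip()).query
        for name, _ in parse_qsl(query, keep_blank_values=True):
            param_counts[name] += 1

# Most frequent parameters first: good candidates for specific Disallow rules.
for name, count in param_counts.most_common():
    print(f'{name}: {count} URLs')
```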
If you are really 100% sure that there's only one param which you want to let through, then you'd go with:
User-agent: *
Disallow: /*?
Allow: /*?utm_source=google_shopping
Allow: /*&utm_source=google_shopping*
(or something pretty similar to that!)
Before you set anything live, get down a list of URLs which represent the blocks (and allows) you want to achieve, and test them all with the robots.txt tester in Search Console!
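If it helps, here's a rough Python sketch of the matching logic Google applies (the longest matching pattern wins, and Allow beats Disallow on a tie). This is a simplified model for illustration only, not Google's actual parser, so treat the Search Console tester as the source of truth:

```python
import re

def pattern_to_regex(pattern):
    # Escape regex metacharacters, then translate the robots.txt wildcards:
    # '*' matches any run of characters; a trailing '$' anchors the match
    # to the end of the URL.
    regex = re.escape(pattern).replace(r'\*', '.*')
    if regex.endswith(r'\$'):
        regex = regex[:-2] + '$'
    return re.compile('^' + regex)

def is_allowed(path, rules):
    # rules: (directive, pattern) pairs, e.g. ('Disallow', '/*?').
    # Google-style precedence: the longest matching pattern wins;
    # on a tie, Allow beats Disallow; no matching rule means allowed.
    best = None
    for directive, pattern in rules:
        if pattern and pattern_to_regex(pattern).match(path):
            candidate = (len(pattern), directive == 'Allow')
            if best is None or candidate > best:
                best = candidate
    return best is None or best[1]

rules = [
    ('Disallow', '/*?'),
    ('Allow', '/*?utm_source=google_shopping'),
    ('Allow', '/*&utm_source=google_shopping*'),
]

for path in ['/product?utm_source=google_shopping',
             '/product?colour=red&utm_source=google_shopping',
             '/product?colour=red',
             '/product']:
    print(path, '->', 'allowed' if is_allowed(path, rules) else 'blocked')
```

Run against the rules above, it should report the utm_source URLs as allowed and other parameterised URLs as blocked.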