Can I, in Google's good graces, check for Googlebot to turn on/off tracking parameters in URLs?
-
Basically, we use a number of parameters in our URLs for event tracking, and Google could be crawling a practically infinite number of these URLs. I'm already using the canonical tag to point at the non-tracking versions of those URLs... but that doesn't stop the crawling, though.
I want to know if I can do conditional 301s, or just detect the user agent, as a way to know when NOT to append those parameters.
Just trying to follow their guidelines about allowing bots to crawl without things like session IDs... but they don't tell you HOW to do this.
Thanks!
-
No problem Ashley!
It sounds like that would fall under cloaking, albeit pretty benign as far as cloaking goes. There's some more info here. The Matt Cutts video on that page has a lot of good information. Apparently any cloaking is against Google's guidelines. I suspect you could get away with it, but I'd worry every day about a Google penalty being handed down.
-
The syntax is correct. Assuming the site: and inurl: operators work in Bing as they do in Google, Bing is not indexing URLs with the parameters.
That article you referred to only explains how to sniff out Googlebot... it's one of a couple that do. What it doesn't tell me, unfortunately, is whether there are any consequences of doing so and then taking some kind of action, like shutting off the event tracking parameters in this case.
Just to be clear...thanks a bunch for helping out!
-
My sense from what you've told me is that canonicals should be working in your case. What you're trying to use them for is exactly what they're intended to do. Are you sure the syntax is correct, and that they're in the `<head>` of the page or being set in the HTTP header?
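For reference, a correctly placed canonical on one of the parameterized pages would look something like this (the URLs here are placeholders, not from the actual site):

```html
<!-- In the <head> of the parameterized page, pointing at the clean URL -->
<link rel="canonical" href="https://www.example.com/events" />
```

Or, equivalently, as an HTTP response header: `Link: <https://www.example.com/events>; rel="canonical"`.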
Google does set it up so you can sniff out Googlebot and return different content (see here), but that would be an unusual thing to do given the circumstances. I doubt you'd get penalized for cloaking by redirecting parameterized URLs to canonical ones only for Googlebot, but I'd still be nervous about doing it.
Just curious, is Bing respecting the canonicals?
-
Yeah, we can't noindex anything because there literally is NO way to crawl the site without picking up tracking parameters.
So we're saying there is literally no good/approved way to say, "oh look, it's Google, let's make sure we don't put any of these params on the URL"? Is that the consensus?
-
If these duplicate pages have URLs appearing in search results, then either the canonicals aren't working or Google just hasn't tried to reindex those pages yet. If the pages are duplicates, you've set the canonical correctly, and you've entered the parameters in Google Webmaster Tools, then those pages should drop out of the index over time as Google reindexes them. You could try manually submitting a few of these parameterized URLs for reindexing in Google Webmaster Tools and see whether they disappear from the results pages afterward. If they do, then it's just a matter of waiting for Googlebot to find them all.
If that doesn't work, you could try something tricky: add meta noindex tags to the pages with URL parameters, wait until they fall out of the index, then add the canonical tags back and see whether those pages come back into the SERPs. If they do, then Google is ignoring your canonical tags. I hate to temporarily noindex pages like this... but if they're all appearing separately in the SERPs anyhow, they're not pooling their link juice properly anyway.
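The temporary noindex experiment described above would mean swapping the canonical for a robots meta tag in the `<head>` of each parameterized page. A minimal sketch, not specific to any platform:

```html
<!-- Temporarily replaces the canonical tag while the test runs -->
<meta name="robots" content="noindex" />
```

Once the pages have dropped out of the index, this tag comes back out and the canonical goes back in.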
-
Thank you for your response. Even if I tell them that the parameters don't alter content, which I have, that doesn't reduce how many pages Google has to crawl. That's my main concern... that Googlebot is spending too much time on these alternate URLs.
Plus there are millions of these param-laden URLs in the index, regardless of the canonical tag. There is currently no way for Google to crawl the site without picking up parameters that change constantly throughout each visit. This can't be optimal.
-
You're doing the right thing by adding canonicals to those pages. You can also go into Google Webmaster Tools and let them know that those URL parameters don't change the content of the pages. This really is the bread and butter of canonical tags; it's exactly the problem they're supposed to solve.
I wouldn't sniff out Googlebot just to 301 those URLs with parameters to the canonical versions. The canonicals should be sufficient. If you do want to sniff out Googlebot, Google's directions are here. You don't do it by user agent; you do a reverse DNS lookup, then a forward lookup to confirm. Again, I would not do this in your case.
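For completeness, that verification method (reverse DNS on the requesting IP, then a confirming forward lookup) can be sketched roughly like this. The helper names and the hardcoded domain suffixes are my own, so check Google's current documentation before relying on them:

```python
import socket

def hostname_is_google(host):
    # Genuine Google crawlers resolve to hosts under googlebot.com or google.com
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def is_verified_googlebot(ip):
    """Verify a claimed Googlebot IP: reverse DNS, check the domain,
    then forward-resolve the hostname and confirm it maps back to the IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except OSError:
        return False
    if not hostname_is_google(host):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS lookup
    except OSError:
        return False
    return ip in forward_ips  # the lookup must round-trip to the same IP
```

The round trip matters: checking the reverse DNS name alone isn't enough, because anyone who controls their own reverse DNS can make an IP claim to be a googlebot.com host.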