Is User Agent Detection still a valid method for blocking certain URL parameters from the Search Engines?
-
I'm concerned about the cloaking issue. Has anyone successfully implemented user agent detection to provide the search engines with "clean" URLs?
-
I would not risk it. It would be better to block them in robots.txt, but I don't really like that idea much either. A noindex, follow tag is better if you can manage it.
I have not seen your URLs and don't know the reason why you have the problem, but of course it is best to avoid the problem in the first place.
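Not from the thread, but for illustration: a minimal sketch of the noindex, follow approach mentioned above, assuming the parameter pages are plain HTML (the URL in the comment is hypothetical):

```
<!-- Placed in the <head> of the parameter URL you want kept out of
     the index, e.g. example.com/shoes?sessionid=abc123 (hypothetical) -->
<meta name="robots" content="noindex, follow">
```

Unlike user agent detection, the same markup is served to every visitor, so there is no cloaking risk: search engines simply drop the page from the index while still following its links. The equivalent directive can also be sent as an X-Robots-Tag HTTP response header.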
Related Questions
-
How to deal with parameter URLs as primary internal links and not canonicals? Weird situation inside...
So I have a weird situation, and I was hoping someone could help. This is for an ecommerce site.

1. Parameters are used to tie Product Detail Pages (PDPs) to individual categories. This is represented in the breadcrumbs for the page and the use of a categoryid. One product can thus be included in multiple categories.
2. All of these PDPs have a canonical that does not include the parameter / categoryid.
3. With very few exceptions, the canonical URLs for the PDPs are not linked to. Instead, the parameter URL is used to tie the product to a specific category, primarily for the sake of breadcrumbs, it seems.

One of the big issues we've been having is the canonical URLs not being indexed for a lot of the products. In some instances the canonicals _are_ indexed alongside parameters, or just parameter URLs are indexed. It's all very... mixed up, I suppose. My theory is that the majority of canonical URLs not being linked to anywhere on the site is forcing Google to put preference on the internal link instead. My problem? **I have no idea what to recommend to the client (who will not change the parameter setup).** One of our Technical SEOs recommended we "use cookies instead of parameters to assign breadcrumbs based on how the PDP is accessed." I have no experience with this. So... yeah. Any thoughts? Suggestions? Thanks in advance.
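For readers following along, a minimal sketch of the pattern described in points 1-3, with hypothetical URLs (this restates the question's setup, not a recommendation):

```
<!-- Internal links and breadcrumbs point at the parameter URL:
     https://www.example.com/widget?categoryid=42 (hypothetical) -->
<!-- ...while that page's canonical strips the parameter: -->
<link rel="canonical" href="https://www.example.com/widget">
```

Since rel=canonical is a hint rather than a directive, internal links that overwhelmingly point at the parameter versions give Google a strong signal that contradicts the hint, which fits the mixed indexing described above.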
Intermediate & Advanced SEO | Alces
-
Is Link equity / Link Juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi, if there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
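For illustration, a sketch of the two cases being compared (the URLs and the robots.txt rule are hypothetical):

```
<!-- Case 1: an ordinary link to a URL disallowed in robots.txt -->
<a href="https://www.example.com/blocked/">Blocked page</a>
<!-- with robots.txt containing:
       User-agent: *
       Disallow: /blocked/ -->

<!-- Case 2: a link explicitly marked nofollow -->
<a href="https://www.example.com/other-page/" rel="nofollow">Other page</a>
```

The mechanisms differ: robots.txt only prevents crawling (the blocked URL can still be indexed and can still receive link signals), while nofollow asks Google not to pass signals through that particular link.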
Intermediate & Advanced SEO | Andrew-SEO
-
Website using search term as URL brand name to cheat Google
Google has come a long way over the past 5 years; the quality updates have really helped bring top-quality content that is relevant to users' search terms to the top, although there is one really ANNOYING thing that still has not been fixed: websites using the brand name as a service search term to manipulate Google. I have a real example, but I wouldn't like to use it in case the brand mention flags up in their tools and they spot this post, so take this search for example: "Service+Location". You will get 'service+location.com' ranking #1. Why? Heaven knows. They have fewer than 100 backlinks, which are of very low, spammy quality from directories. The content is poor compared to the competition, while the competitors have amazing link profiles, great social engagement and a much better website user experience, and the data does not prove anything. All the competitors are targeting the same search term, and yet the worst site is ranking the highest. Why on earth is Google not fixing this issue? The page we are seeing rank #1 does not deserve to be ranking in the first 5 pages.
Intermediate & Advanced SEO | Jseddon92
-
URL structure of a blog
We are trying to work out the best structure for our blog, as we want each page to rank as highly as possible. We were looking at a flat structure similar to http://www.hongkiat.com/blog/, where every post sits directly after /blog/ and not in categories, although viewers can browse different categories from the buttons at the top of the page (Photoshop, icons, etc.). Or we were going to go the structured way: blog/photoshop/blog-post.html. The only problem is that we will end up at least 4 levels deep with this, and with at least 80 characters in the URL. Any help would be appreciated. Thanks, Shaun
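A sketch of the two structures under discussion, with hypothetical paths:

```
# Flat: every post sits directly under /blog/, with categories
# exposed only through navigation
https://www.example.com/blog/my-photoshop-tutorial/

# Structured: the category becomes a folder in the path
https://www.example.com/blog/photoshop/my-photoshop-tutorial.html
```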
Intermediate & Advanced SEO | BobAnderson
-
HTTPS Certificate Expired. Website with https URLs still in the index.
Hi guys. This week the security certificate of our website expired, and we basically now have to wait until next Tuesday for it to be re-instated. So obviously our website is indexed with the https URLs, and we had to drop the https from our site so that people would not be faced with the security-risk screen most browsers show, asking if you are sure you want to visit the site, because it is seen as untrusted. So now we are basically sitting with the site URLs only being www... My question: what should we do to prevent Google from penalizing us, since obviously if Googlebot comes to crawl these https URLs, there will be nothing there? I did re-submit the site to Google to crawl, but I guess it is going to take time before Google picks up that for now we only want the www URLs in the index. Can somebody please give me some advice on this? Thanks, Dave
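Not necessarily the asker's server setup, but a minimal sketch, assuming Apache with mod_rewrite, of the 301 rule that would go back in place once the certificate is re-instated, so the https URLs recover their standing (domain hypothetical):

```
RewriteEngine On
# Once the certificate is valid again, permanently redirect HTTP to HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

In the interim, any https-to-http redirect should be a 302 rather than a 301, signalling to Google that the move away from https is temporary.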
Intermediate & Advanced SEO | daveza
-
Blocking out specific URLs with robots.txt
I've been trying to block out a few URLs using robots.txt, but I can't seem to get the specific one I'm trying to block. Here is an example: I'm trying to block something.com/cats but not something.com/cats-and-dogs. It seems that if I set up my robots.txt like so:

Disallow: /cats

it blocks both URLs. When I crawl the site with Screaming Frog, that Disallow causes both URLs to be blocked. How can I set up my robots.txt to specifically block /cats? I thought it was by doing it the way I was, but that doesn't seem to solve it. Any help is much appreciated; thanks in advance.
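One common fix, sketched below: Google and Bing treat "$" as an end-of-URL anchor in robots.txt, so the pattern matches /cats exactly without also matching /cats-and-dogs. Note that "$" is not part of the original robots.txt standard, so other crawlers may ignore it:

```
User-agent: *
# "$" anchors the pattern to the end of the URL:
# /cats is blocked, /cats-and-dogs is not
Disallow: /cats$
```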
Intermediate & Advanced SEO | Whebb
-
What Should I Do With My URL Names?
I release a property on my blog each week, and it has come to the point where we will get property in the same area as we have had in the past. So, I name my URL /blah-blah-blah-[area of property]/ for the first property in that area, right? Now I get a different property in that same area, and the URL will have to be named /blah-blah-blah-[area of property]-2/. I'm not sure if this is a major issue or not, but I'm sure there must be a better way than this, and I don't really want to take down our past properties - unless you can give me a good reason to, of course? So before I start getting URLs like this: /blah-blah-blah-[area of property]-2334343534654/ (well, ok, maybe not that bad! But you get my point), I wanted to see what everyone's opinion on it is 🙂 Thanks in advance!
Intermediate & Advanced SEO | JonathanRolande
-
Duplicate content via dynamic URLs where the difference is only parameter order?
I have a question about the order of parameters in a URL versus duplicate content issues. The URLs would be identical if the parameter order was the same, e.g.
www.example.com/page.php?color=red&size=large&gender=male versus
www.example.com/page.php?gender=male&size=large&color=red
How smart is Google at consolidating these, and do these consolidated pages incur any penalty (is their combined “weight” equal to their individual selves)? Does Google really see these two pages as DISTINCT, or does it recognize that they are the same because they have the exact same parameters? Is this worth fixing, or does it have a trivial impact? If we have to fix it and can't change our CMS, should we set a preferred, canonical order for these URLs or 301 redirect from one version to the other? Thanks a million!
Intermediate & Advanced SEO | anthematic
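A minimal sketch of the preferred-canonical-order option from the question: pick one ordering (alphabetical by parameter name is a common, easily automated convention) and have every variant declare it (URL hypothetical):

```
<!-- Served on both orderings of the parameters -->
<link rel="canonical"
      href="https://www.example.com/page.php?color=red&gender=male&size=large">
```

There is no penalty for this kind of duplication, but ranking signals can be split across the variants until Google consolidates them; a consistent canonical (or a 301 to the preferred order, if the CMS allows) removes the guesswork.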