Is User Agent Detection still a valid method for blocking certain URL parameters from search engines?
-
I'm concerned about the cloaking issue. Has anyone successfully implemented user agent detection to provide the search engines with "clean" URLs?
-
I would not risk it. It would be better to block them in robots.txt, but I don't really like that idea much either. A noindex, follow tag is better if you can manage it.
I have not seen your URLs and don't know the reason why you have the problem, but of course it is best to avoid the problem in the first place.
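To make the two safer alternatives concrete, a minimal sketch; the sessionid and sort parameter names are hypothetical, since the problem URLs were never shared:

```
# robots.txt -- keep parameterized URLs out of the crawl
# (Google and Bing honour the * wildcard; not every crawler does)
User-agent: *
Disallow: /*?sessionid=
Disallow: /*sort=
```

```html
<!-- Or, on the parameterized pages themselves: stay out of the index,
     but let crawlers still follow the links on the page -->
<meta name="robots" content="noindex, follow">
```

Both options show every bot the same content at the same URL, which is what keeps them clear of the cloaking risk that user agent detection carries.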
Related Questions
-
SEO effect of URL with subfolder versus parameters?
I'll make this quick and simple. Let's say you have a business located in several cities. You've built individual pages for each city (linked to from a master list of your locations). For SEO purposes, is it better to have the URL be a subfolder or a parameter off the home page URL?
https://www.mysite.com/dallas (which is essentially https://www.mysite.com/dallas/index.php), or
http://www.mysite.com/?city=dallas (which is essentially https://www.mysite.com/index.php?city=dallas)
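For what it's worth, the two structures are not mutually exclusive; the clean subfolder URL can be rewritten internally to the same script. A minimal sketch, assuming Apache with mod_rewrite (the pattern and filenames mirror the hypothetical example above):

```apache
# .htaccess -- serve /dallas from index.php?city=dallas internally,
# so visitors and crawlers only ever see the clean subfolder URL
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([a-z-]+)/?$ index.php?city=$1 [L,QSA]
```

With a rule like this in place, the subfolder version can be the only URL you ever link to or expose.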
Intermediate & Advanced SEO | Searchout
-
URL Parameters as a single solution vs Canonical tags
Hi all, We are running a classifieds platform in Spain (mercadonline.es) that has a lot of duplicate content. The majority of our duplicate content consists of URLs that contain site parameters; in other words, they are the result of multiple pages within the same subcategory being sorted by different field names, like price and type of ad. I believe that if I assign the correct group of URLs to each parameter in Google Webmaster Tools, a lot of these duplicate issues will be resolved. Still, a few questions remain:
1. Once I set, for example, the 'page' parameter and choose 'paginates' as its behaviour, do I let Googlebot decide whether to index these pages, or do I set them to 'no'? Since I told Google Webmaster Tools what type of URLs contain this parameter, it will know these are relevant pages, yet they are not always completely different in content.
2. Other URLs that contain 'sortby' don't differ in content at all, so I set these to 'sorting' as the behaviour and set them to 'no' for Google crawling.
3. What parameter setting can I use for 'search', i.e. the parameter that causes the URLs to contain an internal search string? Since this search parameter changes all the time depending on the user input, how can I choose the best one? I think I need 'specifies'?
4. Do I still need to assign canonical tags for all of these URLs after this process, or is setting parameters, in my case, an alternative solution to this problem?
I can send examples of the duplicates, but most of them contain 'page', 'descending', 'sort by' etc. values. Thank you for your help. Ivor
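For comparison, the canonical-tag route would put a tag like the following on every sorted or paginated variant. A sketch only; the subcategory URL is a hypothetical example, not taken from the site:

```html
<!-- On a variant such as /coches?sortby=price&page=2 (hypothetical),
     point search engines at the clean subcategory URL -->
<link rel="canonical" href="https://www.mercadonline.es/coches">
```

Note that parameter settings in Webmaster Tools only guide Google's crawler, while the canonical tag is a hint all major engines read, so many sites end up using both.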
Intermediate & Advanced SEO | ivordg
-
Change in-site URL structure
Hello guys! I have a situation with a website and I need some opinions. Today the structure of my site is as follows (I have had this site architecture for many years):

Main country home (www.mysite.com.tld)
- Product_1 home (www.mysite.com.tld/product1/)
  - Product_1 articles:
    - www.mysite.com.tld/product1/product1_art1
    - www.mysite.com.tld/product1/product1_art2
    - www.mysite.com.tld/product1/product1_artx
- Product_2 home (www.mysite.com.tld/product2/)
  - Product_2 articles:
    - www.mysite.com.tld/product2/product2_art1
    - www.mysite.com.tld/product2/product2_art2
    - www.mysite.com.tld/product2/product2_artx

I have several TLDs, each with its main home and its products. We are thinking of modifying this structure and beginning to use subdomains for each product (the IT guys need this approach because it is simpler for distributing the server load). I am not very comfortable with subdomains, and big changes like this can always produce problems (even if the SEO migration itself goes well, issues like ranking drops could appear). But the solution, for technical reasons, requires mixing directories and subdomains for each product, leaving the structure this way:

Main country home (www.mysite.com.tld)
- Product_1 home (www.mysite.com.tld/product1/)
  - Product_1 articles:
    - product1.mysite.com.tld/product1_art1
    - product1.mysite.com.tld/product1_art2
    - product1.mysite.com.tld/product1_artx
- Product_2 home (www.mysite.com.tld/product2/)
  - Product_2 articles:
    - product2.mysite.com.tld/product2_art1
    - product2.mysite.com.tld/product2_art2
    - product2.mysite.com.tld/product2_artx

So the product home will be in a directory, but the pages for that product's articles will be on a subdomain. What do you think about this solution? Assuming the SEO migration is done properly (301s, etc.), could it bring us difficulties in the rankings, or can the change be made without any special considerations? Thanks very much! Agustin
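If you do go ahead, the make-or-break piece is a complete set of 301s from the old directory URLs to the new subdomain URLs. A minimal sketch, assuming Apache and the hypothetical URL patterns above:

```apache
# .htaccess on www.mysite.com.tld -- permanently redirect old article
# URLs to the matching per-product subdomain
RewriteEngine On
RewriteRule ^product1/(product1_.+)$ http://product1.mysite.com.tld/$1 [R=301,L]
RewriteRule ^product2/(product2_.+)$ http://product2.mysite.com.tld/$1 [R=301,L]
```

The product home pages stay where they are, so only the article patterns need rules.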
Intermediate & Advanced SEO | SEOTeamDespegar
-
Recommended URL Structure
Hello, We are currently adding a new section of content on our site related to marketing, and more specifically 'Digital Marketing' (research reports, trend studies, etc.). Over time (several months, or 1-3 years) we will add more 'general' marketing content. My question is which of the following URL structures makes more sense from an SEO perspective (and how best to quantify the benefit of one over the other):
www.mysite.com/marketing/digital/research/...
www.mysite.com/digital-marketing/research/...
Thanks, Mike
Intermediate & Advanced SEO | mike-gart
-
Short URLs vs Medium URLs?
Hello Moooooooooooz! I got into an SEO fight today and thought the best thing would be to involve more people in the fight! 😛 Do you think it's better to have:
A - company.com/services/service1.html, or
B - company.com/service1.html?
I was for A, as 'services' is also googled to find service1. I also think it's better to help Google understand where the service sits on the website. My friend was for B, as URLs have to stay as short as possible. What do you think?
ps: I can create the URLs I want using Joomla and sh404. The website has 4 different categories: /about, /services, /products, /projects. Thanks! 🙂
Intermediate & Advanced SEO | AymanH
-
Indexed non-existent pages: the problem appeared after we 301'd the /index URL to the root URL.
I recently read that if a site has two live pages such as http://www.url.com/index and http://www.url.com/, they will come up as duplicates. I read that it's best to 301 redirect http://www.url.com/index to http://www.url.com/; this helps avoid duplicate content and keeps all the link juice on one page. We did the 301 for one of our clients and got about 20,000 errors for pages that do not exist. The errors are for pages that are indexed but do not exist on the server. We are assuming that these indexed (non-existent) pages are somehow linked to http://www.url.com/index. The links are showing 200 OK. We took off the 301 redirect from the http://www.url.com/index page; however, we now again have two identical pages, http://www.url.com/index and http://www.url.com/. What is the best way to solve this issue?
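For reference, the redirect is usually implemented along these lines. A sketch, assuming Apache; the site's real index filename was never given, so index.php here is an assumption:

```apache
# .htaccess -- 301 direct requests for /index or /index.php to the root.
# Matching THE_REQUEST (the literal request line from the browser) avoids
# a loop when the server internally rewrites / to index.php itself.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index(\.php)?[\ ?] [NC]
RewriteRule ^index(\.php)?$ http://www.url.com/ [R=301,L]
```

Testing with curl -I http://www.url.com/index should then show a 301 with Location: http://www.url.com/.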
Intermediate & Advanced SEO | Bryan_Loconto
-
Blocking Pages via Robots.txt: Can Images on Those Pages Be Included in Image Search?
Hi! I have pages within my forum where visitors can upload photos. When they upload photos they provide a simple statement about the photo, but no real information about the image, definitely not enough for the page to be deemed worthy of being indexed. The industry, however, is one that really leans on images, and having the images in Google Image search is important to us. The URL structure is like such: domain.com/community/photos/~username~/picture111111.aspx
I wish to block the whole folder from Googlebot to prevent these low-quality pages from being added to Google's main SERP results. This would be something like this:
User-agent: Googlebot
Disallow: /community/photos/
Can I disallow Googlebot specifically, rather than using User-agent: *, so that Googlebot-Image can still pick up the photos? I plan on configuring a way to add meaningful alt attributes and image names to assist in visibility, but the actual act of blocking the pages while still getting the images picked up... Is this possible? Thanks! Leona
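For what it's worth, a sketch of the full file. It relies on the rule, documented in Google's robots.txt handling, that each crawler obeys only the single most specific user-agent group that matches it, so Googlebot-Image ignores the Googlebot group once it has a group of its own:

```
# robots.txt (a sketch using the folder from the question)

# The web-search crawler stays out of the photo pages
User-agent: Googlebot
Disallow: /community/photos/

# Googlebot-Image matches this more specific group instead,
# so the images remain crawlable for image search
User-agent: Googlebot-Image
Allow: /community/photos/
```

One caveat: with the HTML pages blocked, Google loses the surrounding text, so the planned alt attributes and descriptive filenames would carry most of the ranking weight for those images.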
Intermediate & Advanced SEO | HD_Leona
-
How to let search engines index login-first SNS sites?
What's an effective way to let major search engines index login-first SNS sites? The reason I'm asking is that I've seen search engines index millions of SNS pages even though most of them ask you to log in when you visit. How do search engines get through this? http://www.baidu.com/s?wd=site%3Akaixin001.com&pn=50 Thanks, Boson
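One widely used answer at the time was referrer-based gating, in the spirit of Google's 'First Click Free' policy: serve the full page to visitors who arrive from a search results page, and show the login wall to everyone else. A rough PHP sketch; render_article() and the login path are hypothetical stand-ins:

```php
<?php
// First Click Free-style gate (a sketch, not any specific site's code).
session_start();

// Did this visitor arrive from a Google or Baidu results page?
$referrer   = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$fromSearch = (bool) preg_match('~^https?://(www\.)?(google|baidu)\.[a-z.]+/~i', $referrer);

// Is this a known crawler? (A real deployment should verify the claim
// with a reverse-DNS lookup; the user-agent string alone can be faked.)
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isCrawler = (bool) preg_match('~Googlebot|Baiduspider~i', $userAgent);

if (isset($_SESSION['user_id']) || $fromSearch || $isCrawler) {
    // Logged-in member, a first click from a SERP, or a crawler: show the page.
    render_article(); // hypothetical helper that outputs the content
} else {
    // Everyone else gets sent to the login wall.
    header('Location: /login?next=' . urlencode($_SERVER['REQUEST_URI']));
    exit;
}
```

The referrer branch is what kept this within the policy of the era: the crawler sees the same article a searcher's first click sees, so it is not cloaking in the sense the first question in this thread worries about.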
Intermediate & Advanced SEO | DarwinChinaSEO